00:00:00.001 Started by upstream project "autotest-spdk-v24.01-LTS-vs-dpdk-v22.11" build number 619 00:00:00.001 originally caused by: 00:00:00.002 Started by upstream project "nightly-trigger" build number 3284 00:00:00.002 originally caused by: 00:00:00.002 Started by timer 00:00:00.025 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.025 The recommended git tool is: git 00:00:00.026 using credential 00000000-0000-0000-0000-000000000002 00:00:00.027 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.040 Fetching changes from the remote Git repository 00:00:00.044 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.066 Using shallow fetch with depth 1 00:00:00.066 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.066 > git --version # timeout=10 00:00:00.109 > git --version # 'git version 2.39.2' 00:00:00.109 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.163 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.163 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:02.938 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:02.948 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:02.962 Checking out Revision 1c6ed56008363df82da0fcec030d6d5a1f7bd340 (FETCH_HEAD) 00:00:02.962 > git config core.sparsecheckout # timeout=10 00:00:02.973 > git read-tree -mu HEAD # timeout=10 00:00:02.988 > git checkout -f 1c6ed56008363df82da0fcec030d6d5a1f7bd340 # timeout=5 00:00:03.007 Commit message: "spdk-abi-per-patch: pass revision to subbuild" 00:00:03.007 > git rev-list --no-walk 1c6ed56008363df82da0fcec030d6d5a1f7bd340 # timeout=10 00:00:03.088 [Pipeline] Start of Pipeline 00:00:03.100 [Pipeline] library 00:00:03.101 Loading library shm_lib@master 00:00:03.102 Library shm_lib@master is cached. Copying from home. 00:00:03.115 [Pipeline] node 00:00:03.133 Running on WFP20 in /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:03.135 [Pipeline] { 00:00:03.145 [Pipeline] catchError 00:00:03.146 [Pipeline] { 00:00:03.156 [Pipeline] wrap 00:00:03.164 [Pipeline] { 00:00:03.186 [Pipeline] stage 00:00:03.188 [Pipeline] { (Prologue) 00:00:03.399 [Pipeline] sh 00:00:03.682 + logger -p user.info -t JENKINS-CI 00:00:03.696 [Pipeline] echo 00:00:03.697 Node: WFP20 00:00:03.703 [Pipeline] sh 00:00:03.994 [Pipeline] setCustomBuildProperty 00:00:04.006 [Pipeline] echo 00:00:04.008 Cleanup processes 00:00:04.013 [Pipeline] sh 00:00:04.296 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:04.296 2143061 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:04.338 [Pipeline] sh 00:00:04.619 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:04.620 ++ grep -v 'sudo pgrep' 00:00:04.620 ++ awk '{print $1}' 00:00:04.620 + sudo kill -9 00:00:04.620 + true 00:00:04.630 [Pipeline] cleanWs 00:00:04.637 [WS-CLEANUP] Deleting project workspace... 00:00:04.638 [WS-CLEANUP] Deferred wipeout is used... 
00:00:04.642 [WS-CLEANUP] done 00:00:04.646 [Pipeline] setCustomBuildProperty 00:00:04.656 [Pipeline] sh 00:00:04.966 + sudo git config --global --replace-all safe.directory '*' 00:00:05.031 [Pipeline] httpRequest 00:00:05.058 [Pipeline] echo 00:00:05.059 Sorcerer 10.211.164.101 is alive 00:00:05.065 [Pipeline] httpRequest 00:00:05.069 HttpMethod: GET 00:00:05.069 URL: http://10.211.164.101/packages/jbp_1c6ed56008363df82da0fcec030d6d5a1f7bd340.tar.gz 00:00:05.069 Sending request to url: http://10.211.164.101/packages/jbp_1c6ed56008363df82da0fcec030d6d5a1f7bd340.tar.gz 00:00:05.088 Response Code: HTTP/1.1 200 OK 00:00:05.089 Success: Status code 200 is in the accepted range: 200,404 00:00:05.089 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_1c6ed56008363df82da0fcec030d6d5a1f7bd340.tar.gz 00:00:25.603 [Pipeline] sh 00:00:25.885 + tar --no-same-owner -xf jbp_1c6ed56008363df82da0fcec030d6d5a1f7bd340.tar.gz 00:00:25.901 [Pipeline] httpRequest 00:00:25.918 [Pipeline] echo 00:00:25.919 Sorcerer 10.211.164.101 is alive 00:00:25.928 [Pipeline] httpRequest 00:00:25.933 HttpMethod: GET 00:00:25.933 URL: http://10.211.164.101/packages/spdk_4b94202c659be49093c32ec1d2d75efdacf00691.tar.gz 00:00:25.933 Sending request to url: http://10.211.164.101/packages/spdk_4b94202c659be49093c32ec1d2d75efdacf00691.tar.gz 00:00:25.944 Response Code: HTTP/1.1 200 OK 00:00:25.944 Success: Status code 200 is in the accepted range: 200,404 00:00:25.944 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_4b94202c659be49093c32ec1d2d75efdacf00691.tar.gz 00:01:26.159 [Pipeline] sh 00:01:26.441 + tar --no-same-owner -xf spdk_4b94202c659be49093c32ec1d2d75efdacf00691.tar.gz 00:01:28.986 [Pipeline] sh 00:01:29.265 + git -C spdk log --oneline -n5 00:01:29.265 4b94202c6 lib/event: Bug fix for framework_set_scheduler 00:01:29.265 507e9ba07 nvme: add lock_depth for ctrlr_lock 00:01:29.265 62fda7b5f nvme: check pthread_mutex_destroy() return value 00:01:29.265 e03c164a1 nvme: add nvme_ctrlr_lock 00:01:29.265 d61f89a86 nvme/cuse: Add ctrlr_lock for cuse register and unregister 00:01:29.283 [Pipeline] withCredentials 00:01:29.295 > git --version # timeout=10 00:01:29.306 > git --version # 'git version 2.39.2' 00:01:29.322 Masking supported pattern matches of $GIT_PASSWORD or $GIT_ASKPASS 00:01:29.325 [Pipeline] { 00:01:29.333 [Pipeline] retry 00:01:29.334 [Pipeline] { 00:01:29.350 [Pipeline] sh 00:01:29.631 + git ls-remote http://dpdk.org/git/dpdk-stable v22.11.4 00:01:29.902 [Pipeline] } 00:01:29.923 [Pipeline] // retry 00:01:29.926 [Pipeline] } 00:01:29.940 [Pipeline] // withCredentials 00:01:29.947 [Pipeline] httpRequest 00:01:29.959 [Pipeline] echo 00:01:29.960 Sorcerer 10.211.164.101 is alive 00:01:29.980 [Pipeline] httpRequest 00:01:29.984 HttpMethod: GET 00:01:29.984 URL: http://10.211.164.101/packages/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:01:29.985 Sending request to url: http://10.211.164.101/packages/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:01:29.986 Response Code: HTTP/1.1 200 OK 00:01:29.986 Success: Status code 200 is in the accepted range: 200,404 00:01:29.987 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:01:36.110 [Pipeline] sh 00:01:36.387 + tar --no-same-owner -xf dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:01:37.776 [Pipeline] sh 00:01:38.059 + git -C dpdk log --oneline -n5 00:01:38.059 caf0f5d395 version: 22.11.4 00:01:38.059 7d6f1cc05f 
Revert "net/iavf: fix abnormal disable HW interrupt" 00:01:38.059 dc9c799c7d vhost: fix missing spinlock unlock 00:01:38.059 4307659a90 net/mlx5: fix LACP redirection in Rx domain 00:01:38.059 6ef77f2a5e net/gve: fix RX buffer size alignment 00:01:38.070 [Pipeline] } 00:01:38.087 [Pipeline] // stage 00:01:38.094 [Pipeline] stage 00:01:38.096 [Pipeline] { (Prepare) 00:01:38.115 [Pipeline] writeFile 00:01:38.129 [Pipeline] sh 00:01:38.407 + logger -p user.info -t JENKINS-CI 00:01:38.419 [Pipeline] sh 00:01:38.701 + logger -p user.info -t JENKINS-CI 00:01:38.714 [Pipeline] sh 00:01:38.996 + cat autorun-spdk.conf 00:01:38.996 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:38.996 SPDK_RUN_UBSAN=1 00:01:38.996 SPDK_TEST_FUZZER=1 00:01:38.996 SPDK_TEST_FUZZER_SHORT=1 00:01:38.996 SPDK_TEST_NATIVE_DPDK=v22.11.4 00:01:38.996 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:39.002 RUN_NIGHTLY=1 00:01:39.008 [Pipeline] readFile 00:01:39.035 [Pipeline] withEnv 00:01:39.037 [Pipeline] { 00:01:39.050 [Pipeline] sh 00:01:39.331 + set -ex 00:01:39.331 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]] 00:01:39.331 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:01:39.331 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:39.331 ++ SPDK_RUN_UBSAN=1 00:01:39.331 ++ SPDK_TEST_FUZZER=1 00:01:39.331 ++ SPDK_TEST_FUZZER_SHORT=1 00:01:39.331 ++ SPDK_TEST_NATIVE_DPDK=v22.11.4 00:01:39.331 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:39.331 ++ RUN_NIGHTLY=1 00:01:39.331 + case $SPDK_TEST_NVMF_NICS in 00:01:39.331 + DRIVERS= 00:01:39.331 + [[ -n '' ]] 00:01:39.331 + exit 0 00:01:39.340 [Pipeline] } 00:01:39.352 [Pipeline] // withEnv 00:01:39.357 [Pipeline] } 00:01:39.368 [Pipeline] // stage 00:01:39.378 [Pipeline] catchError 00:01:39.380 [Pipeline] { 00:01:39.391 [Pipeline] timeout 00:01:39.391 Timeout set to expire in 30 min 00:01:39.392 [Pipeline] { 00:01:39.403 [Pipeline] stage 00:01:39.404 [Pipeline] { (Tests) 00:01:39.415 [Pipeline] sh 00:01:39.693 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest 00:01:39.693 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest 00:01:39.693 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest 00:01:39.693 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]] 00:01:39.693 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:39.693 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output 00:01:39.693 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]] 00:01:39.693 + [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:01:39.693 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output 00:01:39.693 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:01:39.693 + [[ short-fuzz-phy-autotest == pkgdep-* ]] 00:01:39.693 + cd /var/jenkins/workspace/short-fuzz-phy-autotest 00:01:39.693 + source /etc/os-release 00:01:39.693 ++ NAME='Fedora Linux' 00:01:39.693 ++ VERSION='38 (Cloud Edition)' 00:01:39.693 ++ ID=fedora 00:01:39.693 ++ VERSION_ID=38 00:01:39.693 ++ VERSION_CODENAME= 00:01:39.693 ++ PLATFORM_ID=platform:f38 00:01:39.693 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)' 00:01:39.693 ++ ANSI_COLOR='0;38;2;60;110;180' 00:01:39.693 ++ LOGO=fedora-logo-icon 00:01:39.693 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38 00:01:39.693 ++ HOME_URL=https://fedoraproject.org/ 00:01:39.693 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/ 00:01:39.693 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:01:39.693 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:01:39.693 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:01:39.693 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38 00:01:39.693 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:01:39.693 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38 00:01:39.693 ++ SUPPORT_END=2024-05-14 00:01:39.693 ++ VARIANT='Cloud Edition' 00:01:39.693 ++ VARIANT_ID=cloud 00:01:39.693 + uname -a 00:01:39.693 Linux spdk-wfp-20 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux 00:01:39.693 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:01:42.984 Hugepages 00:01:42.984 node hugesize free / total 00:01:42.984 node0 1048576kB 0 / 0 00:01:42.984 node0 2048kB 0 / 0 00:01:42.984 node1 1048576kB 0 / 0 00:01:42.984 node1 2048kB 0 / 0 00:01:42.984 00:01:42.984 Type BDF Vendor Device NUMA Driver Device Block devices 00:01:42.984 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:01:42.984 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:01:42.984 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:01:42.984 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:01:42.984 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:01:42.984 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:01:42.984 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:01:42.984 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:01:42.984 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:01:42.984 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:01:42.985 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:01:42.985 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:01:42.985 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:01:42.985 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:01:42.985 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:01:42.985 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:01:42.985 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:01:42.985 + rm -f /tmp/spdk-ld-path 00:01:42.985 + source autorun-spdk.conf 00:01:42.985 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:42.985 ++ SPDK_RUN_UBSAN=1 00:01:42.985 ++ SPDK_TEST_FUZZER=1 00:01:42.985 ++ SPDK_TEST_FUZZER_SHORT=1 00:01:42.985 ++ SPDK_TEST_NATIVE_DPDK=v22.11.4 00:01:42.985 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:42.985 ++ RUN_NIGHTLY=1 00:01:42.985 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:01:42.985 + [[ -n '' ]] 00:01:42.985 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:42.985 + for M in /var/spdk/build-*-manifest.txt 00:01:42.985 + [[ -f 
/var/spdk/build-pkg-manifest.txt ]] 00:01:42.985 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:01:42.985 + for M in /var/spdk/build-*-manifest.txt 00:01:42.985 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:01:42.985 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:01:42.985 ++ uname 00:01:42.985 + [[ Linux == \L\i\n\u\x ]] 00:01:42.985 + sudo dmesg -T 00:01:42.985 + sudo dmesg --clear 00:01:42.985 + dmesg_pid=2144537 00:01:42.985 + [[ Fedora Linux == FreeBSD ]] 00:01:42.985 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:42.985 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:42.985 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:01:42.985 + [[ -x /usr/src/fio-static/fio ]] 00:01:42.985 + sudo dmesg -Tw 00:01:42.985 + export FIO_BIN=/usr/src/fio-static/fio 00:01:42.985 + FIO_BIN=/usr/src/fio-static/fio 00:01:42.985 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:01:42.985 + [[ ! -v VFIO_QEMU_BIN ]] 00:01:42.985 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:01:42.985 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:42.985 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:42.985 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:01:42.985 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:42.985 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:42.985 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:01:42.985 Test configuration: 00:01:42.985 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:42.985 SPDK_RUN_UBSAN=1 00:01:42.985 SPDK_TEST_FUZZER=1 00:01:42.985 SPDK_TEST_FUZZER_SHORT=1 00:01:42.985 SPDK_TEST_NATIVE_DPDK=v22.11.4 00:01:42.985 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:42.985 RUN_NIGHTLY=1 16:10:11 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:01:42.985 16:10:11 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]] 00:01:42.985 16:10:11 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:01:42.985 16:10:11 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:01:42.985 16:10:11 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:42.985 16:10:11 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:42.985 16:10:11 -- paths/export.sh@4 -- $ 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:42.985 16:10:11 -- paths/export.sh@5 -- $ export PATH 00:01:42.985 16:10:11 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:42.985 16:10:11 -- common/autobuild_common.sh@434 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:01:42.985 16:10:11 -- common/autobuild_common.sh@435 -- $ date +%s 00:01:42.985 16:10:11 -- common/autobuild_common.sh@435 -- $ mktemp -dt spdk_1721484611.XXXXXX 00:01:42.985 16:10:11 -- common/autobuild_common.sh@435 -- $ SPDK_WORKSPACE=/tmp/spdk_1721484611.2jFP8B 00:01:42.985 16:10:11 -- common/autobuild_common.sh@437 -- $ [[ -n '' ]] 00:01:42.985 16:10:11 -- common/autobuild_common.sh@441 -- $ '[' -n v22.11.4 ']' 00:01:42.985 16:10:11 -- common/autobuild_common.sh@442 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:42.985 16:10:11 -- common/autobuild_common.sh@442 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk' 00:01:42.985 16:10:11 -- common/autobuild_common.sh@448 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:01:42.985 16:10:11 -- common/autobuild_common.sh@450 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:01:42.985 16:10:11 -- common/autobuild_common.sh@451 -- $ get_config_params 00:01:42.985 16:10:11 -- common/autotest_common.sh@387 -- $ xtrace_disable 00:01:42.985 16:10:11 -- common/autotest_common.sh@10 -- $ set +x 00:01:42.985 16:10:11 -- common/autobuild_common.sh@451 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user' 00:01:42.985 16:10:11 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD= 00:01:42.985 16:10:11 -- spdk/autobuild.sh@12 -- $ umask 022 00:01:42.985 16:10:11 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:42.985 16:10:11 -- spdk/autobuild.sh@16 -- $ date -u 00:01:42.985 Sat Jul 20 02:10:11 PM UTC 2024 00:01:42.985 16:10:11 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:01:42.985 LTS-59-g4b94202c6 00:01:42.985 16:10:11 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:01:42.985 16:10:11 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:01:42.985 16:10:11 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:01:42.985 16:10:11 -- common/autotest_common.sh@1077 -- $ '[' 3 -le 1 ']' 00:01:42.985 16:10:11 -- common/autotest_common.sh@1083 -- $ 
xtrace_disable 00:01:42.985 16:10:11 -- common/autotest_common.sh@10 -- $ set +x 00:01:42.985 ************************************ 00:01:42.985 START TEST ubsan 00:01:42.985 ************************************ 00:01:42.985 16:10:11 -- common/autotest_common.sh@1104 -- $ echo 'using ubsan' 00:01:42.985 using ubsan 00:01:42.985 00:01:42.985 real 0m0.000s 00:01:42.985 user 0m0.000s 00:01:42.985 sys 0m0.000s 00:01:42.985 16:10:11 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:01:42.985 16:10:11 -- common/autotest_common.sh@10 -- $ set +x 00:01:42.985 ************************************ 00:01:42.985 END TEST ubsan 00:01:42.985 ************************************ 00:01:42.985 16:10:11 -- spdk/autobuild.sh@27 -- $ '[' -n v22.11.4 ']' 00:01:42.985 16:10:11 -- spdk/autobuild.sh@28 -- $ build_native_dpdk 00:01:42.985 16:10:11 -- common/autobuild_common.sh@427 -- $ run_test build_native_dpdk _build_native_dpdk 00:01:42.985 16:10:11 -- common/autotest_common.sh@1077 -- $ '[' 2 -le 1 ']' 00:01:42.985 16:10:11 -- common/autotest_common.sh@1083 -- $ xtrace_disable 00:01:42.985 16:10:11 -- common/autotest_common.sh@10 -- $ set +x 00:01:42.985 ************************************ 00:01:42.985 START TEST build_native_dpdk 00:01:42.985 ************************************ 00:01:42.985 16:10:11 -- common/autotest_common.sh@1104 -- $ _build_native_dpdk 00:01:42.985 16:10:11 -- common/autobuild_common.sh@48 -- $ local external_dpdk_dir 00:01:42.985 16:10:11 -- common/autobuild_common.sh@49 -- $ local external_dpdk_base_dir 00:01:42.985 16:10:11 -- common/autobuild_common.sh@50 -- $ local compiler_version 00:01:42.985 16:10:11 -- common/autobuild_common.sh@51 -- $ local compiler 00:01:42.985 16:10:11 -- common/autobuild_common.sh@52 -- $ local dpdk_kmods 00:01:42.985 16:10:11 -- common/autobuild_common.sh@53 -- $ local repo=dpdk 00:01:42.985 16:10:11 -- common/autobuild_common.sh@55 -- $ compiler=gcc 00:01:42.985 16:10:11 -- common/autobuild_common.sh@61 -- $ export CC=gcc 00:01:42.985 16:10:11 -- common/autobuild_common.sh@61 -- $ CC=gcc 00:01:42.985 16:10:11 -- common/autobuild_common.sh@63 -- $ [[ gcc != *clang* ]] 00:01:42.985 16:10:11 -- common/autobuild_common.sh@63 -- $ [[ gcc != *gcc* ]] 00:01:42.985 16:10:11 -- common/autobuild_common.sh@68 -- $ gcc -dumpversion 00:01:42.985 16:10:11 -- common/autobuild_common.sh@68 -- $ compiler_version=13 00:01:42.985 16:10:11 -- common/autobuild_common.sh@69 -- $ compiler_version=13 00:01:42.985 16:10:11 -- common/autobuild_common.sh@70 -- $ external_dpdk_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:42.985 16:10:11 -- common/autobuild_common.sh@71 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:42.985 16:10:11 -- common/autobuild_common.sh@71 -- $ external_dpdk_base_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:01:42.985 16:10:11 -- common/autobuild_common.sh@73 -- $ [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk ]] 00:01:42.985 16:10:11 -- common/autobuild_common.sh@82 -- $ orgdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:42.985 16:10:11 -- common/autobuild_common.sh@83 -- $ git -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk log --oneline -n 5 00:01:42.985 caf0f5d395 version: 22.11.4 00:01:42.985 7d6f1cc05f Revert "net/iavf: fix abnormal disable HW interrupt" 00:01:42.985 dc9c799c7d vhost: fix missing spinlock unlock 00:01:42.985 4307659a90 net/mlx5: fix LACP redirection in Rx domain 00:01:42.985 6ef77f2a5e net/gve: fix RX buffer size alignment 00:01:42.985 16:10:11 -- common/autobuild_common.sh@85 -- $ dpdk_cflags='-fPIC -g -fcommon' 00:01:42.985 16:10:11 -- common/autobuild_common.sh@86 -- $ dpdk_ldflags= 00:01:42.985 16:10:11 -- common/autobuild_common.sh@87 -- $ dpdk_ver=22.11.4 00:01:42.985 16:10:11 -- common/autobuild_common.sh@89 -- $ [[ gcc == *gcc* ]] 00:01:42.985 16:10:11 -- common/autobuild_common.sh@89 -- $ [[ 13 -ge 5 ]] 00:01:42.985 16:10:11 -- common/autobuild_common.sh@90 -- $ dpdk_cflags+=' -Werror' 00:01:42.985 16:10:11 -- common/autobuild_common.sh@93 -- $ [[ gcc == *gcc* ]] 00:01:42.985 16:10:11 -- common/autobuild_common.sh@93 -- $ [[ 13 -ge 10 ]] 00:01:42.985 16:10:11 -- common/autobuild_common.sh@94 -- $ dpdk_cflags+=' -Wno-stringop-overflow' 00:01:42.985 16:10:11 -- common/autobuild_common.sh@100 -- $ DPDK_DRIVERS=("bus" "bus/pci" "bus/vdev" "mempool/ring" "net/i40e" "net/i40e/base") 00:01:42.985 16:10:11 -- common/autobuild_common.sh@102 -- $ local mlx5_libs_added=n 00:01:42.985 16:10:11 -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]] 00:01:42.985 16:10:11 -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]] 00:01:42.985 16:10:11 -- common/autobuild_common.sh@139 -- $ [[ 0 -eq 1 ]] 00:01:42.985 16:10:11 -- common/autobuild_common.sh@167 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:01:42.985 16:10:11 -- common/autobuild_common.sh@168 -- $ uname -s 00:01:42.985 16:10:11 -- common/autobuild_common.sh@168 -- $ '[' Linux = Linux ']' 00:01:42.985 16:10:11 -- common/autobuild_common.sh@169 -- $ lt 22.11.4 21.11.0 00:01:42.985 16:10:11 -- scripts/common.sh@372 -- $ cmp_versions 22.11.4 '<' 21.11.0 00:01:42.985 16:10:11 -- scripts/common.sh@332 -- $ local ver1 ver1_l 00:01:42.985 16:10:11 -- scripts/common.sh@333 -- $ local ver2 ver2_l 00:01:42.985 16:10:11 -- scripts/common.sh@335 -- $ IFS=.-: 00:01:42.985 16:10:11 -- scripts/common.sh@335 -- $ read -ra ver1 00:01:42.985 16:10:11 -- scripts/common.sh@336 -- $ IFS=.-: 00:01:42.985 16:10:11 -- scripts/common.sh@336 -- $ read -ra ver2 00:01:42.985 16:10:11 -- scripts/common.sh@337 -- $ local 'op=<' 00:01:42.985 16:10:11 -- scripts/common.sh@339 -- $ ver1_l=3 00:01:42.985 16:10:11 -- scripts/common.sh@340 -- $ ver2_l=3 00:01:42.985 16:10:11 -- scripts/common.sh@342 -- $ local lt=0 gt=0 eq=0 v 00:01:42.985 16:10:11 -- scripts/common.sh@343 -- $ case "$op" in 00:01:42.985 16:10:11 -- scripts/common.sh@344 -- $ : 1 00:01:42.985 16:10:11 -- scripts/common.sh@363 -- $ (( v = 0 )) 00:01:42.985 16:10:11 -- scripts/common.sh@363 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:01:42.985 16:10:11 -- scripts/common.sh@364 -- $ decimal 22 00:01:42.985 16:10:11 -- scripts/common.sh@352 -- $ local d=22 00:01:42.985 16:10:11 -- scripts/common.sh@353 -- $ [[ 22 =~ ^[0-9]+$ ]] 00:01:42.985 16:10:11 -- scripts/common.sh@354 -- $ echo 22 00:01:42.985 16:10:11 -- scripts/common.sh@364 -- $ ver1[v]=22 00:01:42.985 16:10:11 -- scripts/common.sh@365 -- $ decimal 21 00:01:42.985 16:10:11 -- scripts/common.sh@352 -- $ local d=21 00:01:42.985 16:10:11 -- scripts/common.sh@353 -- $ [[ 21 =~ ^[0-9]+$ ]] 00:01:42.985 16:10:11 -- scripts/common.sh@354 -- $ echo 21 00:01:42.985 16:10:11 -- scripts/common.sh@365 -- $ ver2[v]=21 00:01:42.985 16:10:11 -- scripts/common.sh@366 -- $ (( ver1[v] > ver2[v] )) 00:01:42.985 16:10:11 -- scripts/common.sh@366 -- $ return 1 00:01:42.985 16:10:11 -- common/autobuild_common.sh@173 -- $ patch -p1 00:01:42.985 patching file config/rte_config.h 00:01:42.985 Hunk #1 succeeded at 60 (offset 1 line). 00:01:42.985 16:10:11 -- common/autobuild_common.sh@177 -- $ dpdk_kmods=false 00:01:42.985 16:10:11 -- common/autobuild_common.sh@178 -- $ uname -s 00:01:42.985 16:10:11 -- common/autobuild_common.sh@178 -- $ '[' Linux = FreeBSD ']' 00:01:42.985 16:10:11 -- common/autobuild_common.sh@182 -- $ printf %s, bus bus/pci bus/vdev mempool/ring net/i40e net/i40e/base 00:01:42.985 16:10:11 -- common/autobuild_common.sh@182 -- $ meson build-tmp --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --libdir lib -Denable_docs=false -Denable_kmods=false -Dtests=false -Dc_link_args= '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' -Dmachine=native -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base, 00:01:48.290 The Meson build system 00:01:48.290 Version: 1.3.1 00:01:48.290 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:01:48.290 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp 00:01:48.290 Build type: native build 00:01:48.290 Program cat found: YES (/usr/bin/cat) 00:01:48.290 Project name: DPDK 00:01:48.290 Project version: 22.11.4 00:01:48.290 C compiler for the host machine: gcc (gcc 13.2.1 "gcc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)") 00:01:48.290 C linker for the host machine: gcc ld.bfd 2.39-16 00:01:48.290 Host machine cpu family: x86_64 00:01:48.290 Host machine cpu: x86_64 00:01:48.290 Message: ## Building in Developer Mode ## 00:01:48.290 Program pkg-config found: YES (/usr/bin/pkg-config) 00:01:48.290 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/check-symbols.sh) 00:01:48.290 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/options-ibverbs-static.sh) 00:01:48.290 Program objdump found: YES (/usr/bin/objdump) 00:01:48.290 Program python3 found: YES (/usr/bin/python3) 00:01:48.290 Program cat found: YES (/usr/bin/cat) 00:01:48.290 config/meson.build:83: WARNING: The "machine" option is deprecated. Please use "cpu_instruction_set" instead. 
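The set -ex trace just above (scripts/common.sh) shows how `lt 22.11.4 21.11.0` decides whether the rte_config.h patch applies: `cmp_versions` splits both version strings on `.`, `-`, and `:` into arrays, compares them component by component as decimals, and returns 1 because 22.11.4 is not older than 21.11.0, so the `patch -p1` branch for DPDK >= 21.11 runs. A minimal standalone sketch of the same comparison, assuming purely numeric components (the real helper's `decimal` also normalizes suffixed parts); `version_lt` is a hypothetical name, not the SPDK function:

#!/usr/bin/env bash
# Sketch of the component-wise version compare traced above.
# version_lt A B -> exit 0 if A is strictly older than B, 1 otherwise.
version_lt() {
    local -a ver1 ver2
    local IFS='.-:'                      # same separators as the trace
    read -ra ver1 <<< "$1"
    read -ra ver2 <<< "$2"
    local v a b max=$(( ${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]} ))
    for (( v = 0; v < max; v++ )); do
        a=${ver1[v]:-0} b=${ver2[v]:-0}  # missing components count as 0
        (( a > b )) && return 1          # first differing component decides
        (( a < b )) && return 0
    done
    return 1                             # equal is not "less than"
}

# Mirrors the trace: 22 > 21 on the first component, so the patch branch runs.
version_lt 22.11.4 21.11.0 || echo 'not older: patch config/rte_config.h'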
00:01:48.290 Checking for size of "void *" : 8 00:01:48.290 Checking for size of "void *" : 8 (cached) 00:01:48.290 Library m found: YES 00:01:48.290 Library numa found: YES 00:01:48.290 Has header "numaif.h" : YES 00:01:48.290 Library fdt found: NO 00:01:48.290 Library execinfo found: NO 00:01:48.290 Has header "execinfo.h" : YES 00:01:48.290 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0 00:01:48.290 Run-time dependency libarchive found: NO (tried pkgconfig) 00:01:48.290 Run-time dependency libbsd found: NO (tried pkgconfig) 00:01:48.290 Run-time dependency jansson found: NO (tried pkgconfig) 00:01:48.290 Run-time dependency openssl found: YES 3.0.9 00:01:48.290 Run-time dependency libpcap found: YES 1.10.4 00:01:48.290 Has header "pcap.h" with dependency libpcap: YES 00:01:48.290 Compiler for C supports arguments -Wcast-qual: YES 00:01:48.290 Compiler for C supports arguments -Wdeprecated: YES 00:01:48.290 Compiler for C supports arguments -Wformat: YES 00:01:48.290 Compiler for C supports arguments -Wformat-nonliteral: NO 00:01:48.290 Compiler for C supports arguments -Wformat-security: NO 00:01:48.290 Compiler for C supports arguments -Wmissing-declarations: YES 00:01:48.290 Compiler for C supports arguments -Wmissing-prototypes: YES 00:01:48.290 Compiler for C supports arguments -Wnested-externs: YES 00:01:48.290 Compiler for C supports arguments -Wold-style-definition: YES 00:01:48.290 Compiler for C supports arguments -Wpointer-arith: YES 00:01:48.290 Compiler for C supports arguments -Wsign-compare: YES 00:01:48.290 Compiler for C supports arguments -Wstrict-prototypes: YES 00:01:48.290 Compiler for C supports arguments -Wundef: YES 00:01:48.290 Compiler for C supports arguments -Wwrite-strings: YES 00:01:48.290 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:01:48.290 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:01:48.290 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:01:48.290 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:01:48.290 Compiler for C supports arguments -mavx512f: YES 00:01:48.290 Checking if "AVX512 checking" compiles: YES 00:01:48.290 Fetching value of define "__SSE4_2__" : 1 00:01:48.290 Fetching value of define "__AES__" : 1 00:01:48.290 Fetching value of define "__AVX__" : 1 00:01:48.290 Fetching value of define "__AVX2__" : 1 00:01:48.290 Fetching value of define "__AVX512BW__" : 1 00:01:48.290 Fetching value of define "__AVX512CD__" : 1 00:01:48.290 Fetching value of define "__AVX512DQ__" : 1 00:01:48.290 Fetching value of define "__AVX512F__" : 1 00:01:48.290 Fetching value of define "__AVX512VL__" : 1 00:01:48.290 Fetching value of define "__PCLMUL__" : 1 00:01:48.290 Fetching value of define "__RDRND__" : 1 00:01:48.290 Fetching value of define "__RDSEED__" : 1 00:01:48.290 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:01:48.290 Compiler for C supports arguments -Wno-format-truncation: YES 00:01:48.290 Message: lib/kvargs: Defining dependency "kvargs" 00:01:48.290 Message: lib/telemetry: Defining dependency "telemetry" 00:01:48.290 Checking for function "getentropy" : YES 00:01:48.290 Message: lib/eal: Defining dependency "eal" 00:01:48.290 Message: lib/ring: Defining dependency "ring" 00:01:48.290 Message: lib/rcu: Defining dependency "rcu" 00:01:48.290 Message: lib/mempool: Defining dependency "mempool" 00:01:48.290 Message: lib/mbuf: Defining dependency "mbuf" 00:01:48.290 Fetching value of define "__PCLMUL__" : 1 (cached) 00:01:48.290 Fetching 
value of define "__AVX512F__" : 1 (cached) 00:01:48.290 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:48.290 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:48.290 Fetching value of define "__AVX512VL__" : 1 (cached) 00:01:48.290 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:01:48.290 Compiler for C supports arguments -mpclmul: YES 00:01:48.291 Compiler for C supports arguments -maes: YES 00:01:48.291 Compiler for C supports arguments -mavx512f: YES (cached) 00:01:48.291 Compiler for C supports arguments -mavx512bw: YES 00:01:48.291 Compiler for C supports arguments -mavx512dq: YES 00:01:48.291 Compiler for C supports arguments -mavx512vl: YES 00:01:48.291 Compiler for C supports arguments -mvpclmulqdq: YES 00:01:48.291 Compiler for C supports arguments -mavx2: YES 00:01:48.291 Compiler for C supports arguments -mavx: YES 00:01:48.291 Message: lib/net: Defining dependency "net" 00:01:48.291 Message: lib/meter: Defining dependency "meter" 00:01:48.291 Message: lib/ethdev: Defining dependency "ethdev" 00:01:48.291 Message: lib/pci: Defining dependency "pci" 00:01:48.291 Message: lib/cmdline: Defining dependency "cmdline" 00:01:48.291 Message: lib/metrics: Defining dependency "metrics" 00:01:48.291 Message: lib/hash: Defining dependency "hash" 00:01:48.291 Message: lib/timer: Defining dependency "timer" 00:01:48.291 Fetching value of define "__AVX2__" : 1 (cached) 00:01:48.291 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:48.291 Fetching value of define "__AVX512VL__" : 1 (cached) 00:01:48.291 Fetching value of define "__AVX512CD__" : 1 (cached) 00:01:48.291 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:48.291 Message: lib/acl: Defining dependency "acl" 00:01:48.291 Message: lib/bbdev: Defining dependency "bbdev" 00:01:48.291 Message: lib/bitratestats: Defining dependency "bitratestats" 00:01:48.291 Run-time dependency libelf found: YES 0.190 00:01:48.291 Message: lib/bpf: Defining dependency "bpf" 00:01:48.291 Message: lib/cfgfile: Defining dependency "cfgfile" 00:01:48.291 Message: lib/compressdev: Defining dependency "compressdev" 00:01:48.291 Message: lib/cryptodev: Defining dependency "cryptodev" 00:01:48.291 Message: lib/distributor: Defining dependency "distributor" 00:01:48.291 Message: lib/efd: Defining dependency "efd" 00:01:48.291 Message: lib/eventdev: Defining dependency "eventdev" 00:01:48.291 Message: lib/gpudev: Defining dependency "gpudev" 00:01:48.291 Message: lib/gro: Defining dependency "gro" 00:01:48.291 Message: lib/gso: Defining dependency "gso" 00:01:48.291 Message: lib/ip_frag: Defining dependency "ip_frag" 00:01:48.291 Message: lib/jobstats: Defining dependency "jobstats" 00:01:48.291 Message: lib/latencystats: Defining dependency "latencystats" 00:01:48.291 Message: lib/lpm: Defining dependency "lpm" 00:01:48.291 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:48.291 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:48.291 Fetching value of define "__AVX512IFMA__" : (undefined) 00:01:48.291 Compiler for C supports arguments -mavx512f -mavx512dq -mavx512ifma: YES 00:01:48.291 Message: lib/member: Defining dependency "member" 00:01:48.291 Message: lib/pcapng: Defining dependency "pcapng" 00:01:48.291 Compiler for C supports arguments -Wno-cast-qual: YES 00:01:48.291 Message: lib/power: Defining dependency "power" 00:01:48.291 Message: lib/rawdev: Defining dependency "rawdev" 00:01:48.291 Message: lib/regexdev: Defining dependency "regexdev" 00:01:48.291 Message: lib/dmadev: 
Defining dependency "dmadev" 00:01:48.291 Message: lib/rib: Defining dependency "rib" 00:01:48.291 Message: lib/reorder: Defining dependency "reorder" 00:01:48.291 Message: lib/sched: Defining dependency "sched" 00:01:48.291 Message: lib/security: Defining dependency "security" 00:01:48.291 Message: lib/stack: Defining dependency "stack" 00:01:48.291 Has header "linux/userfaultfd.h" : YES 00:01:48.291 Message: lib/vhost: Defining dependency "vhost" 00:01:48.291 Message: lib/ipsec: Defining dependency "ipsec" 00:01:48.291 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:48.291 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:48.291 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:48.291 Message: lib/fib: Defining dependency "fib" 00:01:48.291 Message: lib/port: Defining dependency "port" 00:01:48.291 Message: lib/pdump: Defining dependency "pdump" 00:01:48.291 Message: lib/table: Defining dependency "table" 00:01:48.291 Message: lib/pipeline: Defining dependency "pipeline" 00:01:48.291 Message: lib/graph: Defining dependency "graph" 00:01:48.291 Message: lib/node: Defining dependency "node" 00:01:48.291 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:01:48.291 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:01:48.291 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:01:48.291 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:01:48.291 Compiler for C supports arguments -Wno-sign-compare: YES 00:01:48.291 Compiler for C supports arguments -Wno-unused-value: YES 00:01:48.291 Compiler for C supports arguments -Wno-format: YES 00:01:48.291 Compiler for C supports arguments -Wno-format-security: YES 00:01:48.291 Compiler for C supports arguments -Wno-format-nonliteral: YES 00:01:48.550 Compiler for C supports arguments -Wno-strict-aliasing: YES 00:01:48.550 Compiler for C supports arguments -Wno-unused-but-set-variable: YES 00:01:48.550 Compiler for C supports arguments -Wno-unused-parameter: YES 00:01:48.550 Fetching value of define "__AVX2__" : 1 (cached) 00:01:48.550 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:48.550 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:48.550 Compiler for C supports arguments -mavx512f: YES (cached) 00:01:48.550 Compiler for C supports arguments -mavx512bw: YES (cached) 00:01:48.550 Compiler for C supports arguments -march=skylake-avx512: YES 00:01:48.550 Message: drivers/net/i40e: Defining dependency "net_i40e" 00:01:48.550 Program doxygen found: YES (/usr/bin/doxygen) 00:01:48.550 Configuring doxy-api.conf using configuration 00:01:48.550 Program sphinx-build found: NO 00:01:48.550 Configuring rte_build_config.h using configuration 00:01:48.550 Message: 00:01:48.550 ================= 00:01:48.550 Applications Enabled 00:01:48.550 ================= 00:01:48.550 00:01:48.550 apps: 00:01:48.550 dumpcap, pdump, proc-info, test-acl, test-bbdev, test-cmdline, test-compress-perf, test-crypto-perf, 00:01:48.550 test-eventdev, test-fib, test-flow-perf, test-gpudev, test-pipeline, test-pmd, test-regex, test-sad, 00:01:48.550 test-security-perf, 00:01:48.550 00:01:48.550 Message: 00:01:48.550 ================= 00:01:48.550 Libraries Enabled 00:01:48.550 ================= 00:01:48.550 00:01:48.550 libs: 00:01:48.550 kvargs, telemetry, eal, ring, rcu, mempool, mbuf, net, 00:01:48.550 meter, ethdev, pci, cmdline, metrics, hash, timer, acl, 00:01:48.550 bbdev, bitratestats, bpf, cfgfile, compressdev, cryptodev, distributor, efd, 00:01:48.550 eventdev, gpudev, 
gro, gso, ip_frag, jobstats, latencystats, lpm, 00:01:48.550 member, pcapng, power, rawdev, regexdev, dmadev, rib, reorder, 00:01:48.550 sched, security, stack, vhost, ipsec, fib, port, pdump, 00:01:48.550 table, pipeline, graph, node, 00:01:48.550 00:01:48.550 Message: 00:01:48.550 =============== 00:01:48.550 Drivers Enabled 00:01:48.550 =============== 00:01:48.550 00:01:48.550 common: 00:01:48.550 00:01:48.550 bus: 00:01:48.550 pci, vdev, 00:01:48.550 mempool: 00:01:48.550 ring, 00:01:48.550 dma: 00:01:48.550 00:01:48.550 net: 00:01:48.550 i40e, 00:01:48.550 raw: 00:01:48.550 00:01:48.550 crypto: 00:01:48.550 00:01:48.550 compress: 00:01:48.550 00:01:48.550 regex: 00:01:48.550 00:01:48.550 vdpa: 00:01:48.550 00:01:48.550 event: 00:01:48.550 00:01:48.550 baseband: 00:01:48.550 00:01:48.550 gpu: 00:01:48.550 00:01:48.550 00:01:48.550 Message: 00:01:48.550 ================= 00:01:48.550 Content Skipped 00:01:48.550 ================= 00:01:48.550 00:01:48.550 apps: 00:01:48.550 00:01:48.550 libs: 00:01:48.550 kni: explicitly disabled via build config (deprecated lib) 00:01:48.550 flow_classify: explicitly disabled via build config (deprecated lib) 00:01:48.550 00:01:48.550 drivers: 00:01:48.550 common/cpt: not in enabled drivers build config 00:01:48.550 common/dpaax: not in enabled drivers build config 00:01:48.550 common/iavf: not in enabled drivers build config 00:01:48.550 common/idpf: not in enabled drivers build config 00:01:48.550 common/mvep: not in enabled drivers build config 00:01:48.550 common/octeontx: not in enabled drivers build config 00:01:48.550 bus/auxiliary: not in enabled drivers build config 00:01:48.550 bus/dpaa: not in enabled drivers build config 00:01:48.550 bus/fslmc: not in enabled drivers build config 00:01:48.550 bus/ifpga: not in enabled drivers build config 00:01:48.550 bus/vmbus: not in enabled drivers build config 00:01:48.550 common/cnxk: not in enabled drivers build config 00:01:48.550 common/mlx5: not in enabled drivers build config 00:01:48.550 common/qat: not in enabled drivers build config 00:01:48.550 common/sfc_efx: not in enabled drivers build config 00:01:48.550 mempool/bucket: not in enabled drivers build config 00:01:48.550 mempool/cnxk: not in enabled drivers build config 00:01:48.550 mempool/dpaa: not in enabled drivers build config 00:01:48.550 mempool/dpaa2: not in enabled drivers build config 00:01:48.550 mempool/octeontx: not in enabled drivers build config 00:01:48.550 mempool/stack: not in enabled drivers build config 00:01:48.550 dma/cnxk: not in enabled drivers build config 00:01:48.550 dma/dpaa: not in enabled drivers build config 00:01:48.550 dma/dpaa2: not in enabled drivers build config 00:01:48.550 dma/hisilicon: not in enabled drivers build config 00:01:48.550 dma/idxd: not in enabled drivers build config 00:01:48.550 dma/ioat: not in enabled drivers build config 00:01:48.550 dma/skeleton: not in enabled drivers build config 00:01:48.550 net/af_packet: not in enabled drivers build config 00:01:48.550 net/af_xdp: not in enabled drivers build config 00:01:48.550 net/ark: not in enabled drivers build config 00:01:48.550 net/atlantic: not in enabled drivers build config 00:01:48.550 net/avp: not in enabled drivers build config 00:01:48.550 net/axgbe: not in enabled drivers build config 00:01:48.550 net/bnx2x: not in enabled drivers build config 00:01:48.550 net/bnxt: not in enabled drivers build config 00:01:48.550 net/bonding: not in enabled drivers build config 00:01:48.550 net/cnxk: not in enabled drivers build config 
00:01:48.550 net/cxgbe: not in enabled drivers build config 00:01:48.550 net/dpaa: not in enabled drivers build config 00:01:48.550 net/dpaa2: not in enabled drivers build config 00:01:48.550 net/e1000: not in enabled drivers build config 00:01:48.550 net/ena: not in enabled drivers build config 00:01:48.550 net/enetc: not in enabled drivers build config 00:01:48.550 net/enetfec: not in enabled drivers build config 00:01:48.550 net/enic: not in enabled drivers build config 00:01:48.550 net/failsafe: not in enabled drivers build config 00:01:48.550 net/fm10k: not in enabled drivers build config 00:01:48.550 net/gve: not in enabled drivers build config 00:01:48.550 net/hinic: not in enabled drivers build config 00:01:48.550 net/hns3: not in enabled drivers build config 00:01:48.550 net/iavf: not in enabled drivers build config 00:01:48.550 net/ice: not in enabled drivers build config 00:01:48.550 net/idpf: not in enabled drivers build config 00:01:48.550 net/igc: not in enabled drivers build config 00:01:48.550 net/ionic: not in enabled drivers build config 00:01:48.550 net/ipn3ke: not in enabled drivers build config 00:01:48.550 net/ixgbe: not in enabled drivers build config 00:01:48.550 net/kni: not in enabled drivers build config 00:01:48.550 net/liquidio: not in enabled drivers build config 00:01:48.550 net/mana: not in enabled drivers build config 00:01:48.550 net/memif: not in enabled drivers build config 00:01:48.550 net/mlx4: not in enabled drivers build config 00:01:48.550 net/mlx5: not in enabled drivers build config 00:01:48.550 net/mvneta: not in enabled drivers build config 00:01:48.550 net/mvpp2: not in enabled drivers build config 00:01:48.550 net/netvsc: not in enabled drivers build config 00:01:48.550 net/nfb: not in enabled drivers build config 00:01:48.550 net/nfp: not in enabled drivers build config 00:01:48.550 net/ngbe: not in enabled drivers build config 00:01:48.550 net/null: not in enabled drivers build config 00:01:48.550 net/octeontx: not in enabled drivers build config 00:01:48.550 net/octeon_ep: not in enabled drivers build config 00:01:48.550 net/pcap: not in enabled drivers build config 00:01:48.550 net/pfe: not in enabled drivers build config 00:01:48.550 net/qede: not in enabled drivers build config 00:01:48.550 net/ring: not in enabled drivers build config 00:01:48.550 net/sfc: not in enabled drivers build config 00:01:48.550 net/softnic: not in enabled drivers build config 00:01:48.550 net/tap: not in enabled drivers build config 00:01:48.550 net/thunderx: not in enabled drivers build config 00:01:48.550 net/txgbe: not in enabled drivers build config 00:01:48.550 net/vdev_netvsc: not in enabled drivers build config 00:01:48.550 net/vhost: not in enabled drivers build config 00:01:48.550 net/virtio: not in enabled drivers build config 00:01:48.550 net/vmxnet3: not in enabled drivers build config 00:01:48.550 raw/cnxk_bphy: not in enabled drivers build config 00:01:48.550 raw/cnxk_gpio: not in enabled drivers build config 00:01:48.550 raw/dpaa2_cmdif: not in enabled drivers build config 00:01:48.550 raw/ifpga: not in enabled drivers build config 00:01:48.550 raw/ntb: not in enabled drivers build config 00:01:48.550 raw/skeleton: not in enabled drivers build config 00:01:48.550 crypto/armv8: not in enabled drivers build config 00:01:48.550 crypto/bcmfs: not in enabled drivers build config 00:01:48.550 crypto/caam_jr: not in enabled drivers build config 00:01:48.550 crypto/ccp: not in enabled drivers build config 00:01:48.550 crypto/cnxk: not in enabled drivers 
build config 00:01:48.550 crypto/dpaa_sec: not in enabled drivers build config 00:01:48.550 crypto/dpaa2_sec: not in enabled drivers build config 00:01:48.550 crypto/ipsec_mb: not in enabled drivers build config 00:01:48.550 crypto/mlx5: not in enabled drivers build config 00:01:48.550 crypto/mvsam: not in enabled drivers build config 00:01:48.551 crypto/nitrox: not in enabled drivers build config 00:01:48.551 crypto/null: not in enabled drivers build config 00:01:48.551 crypto/octeontx: not in enabled drivers build config 00:01:48.551 crypto/openssl: not in enabled drivers build config 00:01:48.551 crypto/scheduler: not in enabled drivers build config 00:01:48.551 crypto/uadk: not in enabled drivers build config 00:01:48.551 crypto/virtio: not in enabled drivers build config 00:01:48.551 compress/isal: not in enabled drivers build config 00:01:48.551 compress/mlx5: not in enabled drivers build config 00:01:48.551 compress/octeontx: not in enabled drivers build config 00:01:48.551 compress/zlib: not in enabled drivers build config 00:01:48.551 regex/mlx5: not in enabled drivers build config 00:01:48.551 regex/cn9k: not in enabled drivers build config 00:01:48.551 vdpa/ifc: not in enabled drivers build config 00:01:48.551 vdpa/mlx5: not in enabled drivers build config 00:01:48.551 vdpa/sfc: not in enabled drivers build config 00:01:48.551 event/cnxk: not in enabled drivers build config 00:01:48.551 event/dlb2: not in enabled drivers build config 00:01:48.551 event/dpaa: not in enabled drivers build config 00:01:48.551 event/dpaa2: not in enabled drivers build config 00:01:48.551 event/dsw: not in enabled drivers build config 00:01:48.551 event/opdl: not in enabled drivers build config 00:01:48.551 event/skeleton: not in enabled drivers build config 00:01:48.551 event/sw: not in enabled drivers build config 00:01:48.551 event/octeontx: not in enabled drivers build config 00:01:48.551 baseband/acc: not in enabled drivers build config 00:01:48.551 baseband/fpga_5gnr_fec: not in enabled drivers build config 00:01:48.551 baseband/fpga_lte_fec: not in enabled drivers build config 00:01:48.551 baseband/la12xx: not in enabled drivers build config 00:01:48.551 baseband/null: not in enabled drivers build config 00:01:48.551 baseband/turbo_sw: not in enabled drivers build config 00:01:48.551 gpu/cuda: not in enabled drivers build config 00:01:48.551 00:01:48.551 00:01:48.551 Build targets in project: 311 00:01:48.551 00:01:48.551 DPDK 22.11.4 00:01:48.551 00:01:48.551 User defined options 00:01:48.551 libdir : lib 00:01:48.551 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:48.551 c_args : -fPIC -g -fcommon -Werror -Wno-stringop-overflow 00:01:48.551 c_link_args : 00:01:48.551 enable_docs : false 00:01:48.551 enable_drivers: bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base, 00:01:48.551 enable_kmods : false 00:01:48.551 machine : native 00:01:48.551 tests : false 00:01:48.551 00:01:48.551 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:01:48.551 WARNING: Running the setup command as `meson [options]` instead of `meson setup [options]` is ambiguous and deprecated. 
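Condensed from the configure summary above and the ninja invocation that follows, the whole DPDK build is one out-of-tree meson configure plus one ninja run. A sketch of that sequence, with $WS standing in for the job workspace /var/jenkins/workspace/short-fuzz-phy-autotest (an abbreviation for readability, not a variable the job itself uses); the trailing `ninja install` is an assumption inferred from the --prefix that SPDK_RUN_EXTERNAL_DPDK later points at, since this section ends before any install step appears:

#!/usr/bin/env bash
# Sketch of the DPDK configure+build flow shown in this log.
set -euo pipefail
WS=/var/jenkins/workspace/short-fuzz-phy-autotest   # abbreviation, see lead-in

cd "$WS/dpdk"
# Same options the job passes; the trailing comma in enable_drivers comes
# from the printf %s, join seen earlier in the trace.
meson build-tmp \
    --prefix="$WS/dpdk/build" --libdir lib \
    -Denable_docs=false -Denable_kmods=false -Dtests=false \
    -Dc_link_args= \
    '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' \
    -Dmachine=native \
    -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,

ninja -C build-tmp -j112      # parallelism matches the -j112 in the log
ninja -C build-tmp install    # assumed follow-up so $WS/dpdk/build exists

Meson's own warning above notes that this bare `meson [options]` form is deprecated; `meson setup [options]` is the unambiguous spelling.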
00:01:48.551 16:10:17 -- common/autobuild_common.sh@186 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 00:01:48.551 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:01:48.813 [1/740] Generating lib/rte_kvargs_def with a custom command 00:01:48.813 [2/740] Generating lib/rte_kvargs_mingw with a custom command 00:01:48.813 [3/740] Generating lib/rte_telemetry_def with a custom command 00:01:48.813 [4/740] Generating lib/rte_telemetry_mingw with a custom command 00:01:48.813 [5/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:01:48.813 [6/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:01:48.813 [7/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:01:48.813 [8/740] Generating lib/rte_eal_def with a custom command 00:01:48.813 [9/740] Generating lib/rte_ring_mingw with a custom command 00:01:48.813 [10/740] Generating lib/rte_rcu_def with a custom command 00:01:48.813 [11/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:01:48.813 [12/740] Generating lib/rte_ring_def with a custom command 00:01:48.813 [13/740] Generating lib/rte_mbuf_def with a custom command 00:01:48.813 [14/740] Generating lib/rte_eal_mingw with a custom command 00:01:48.813 [15/740] Generating lib/rte_rcu_mingw with a custom command 00:01:48.813 [16/740] Generating lib/rte_mempool_def with a custom command 00:01:48.813 [17/740] Generating lib/rte_mempool_mingw with a custom command 00:01:48.813 [18/740] Generating lib/rte_mbuf_mingw with a custom command 00:01:48.813 [19/740] Generating lib/rte_meter_def with a custom command 00:01:48.813 [20/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:01:48.813 [21/740] Generating lib/rte_net_def with a custom command 00:01:48.813 [22/740] Generating lib/rte_net_mingw with a custom command 00:01:48.813 [23/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:01:48.813 [24/740] Generating lib/rte_meter_mingw with a custom command 00:01:48.813 [25/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:01:48.813 [26/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:01:48.813 [27/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:01:48.813 [28/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:01:48.813 [29/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_log.c.o 00:01:48.813 [30/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:01:49.072 [31/740] Generating lib/rte_ethdev_mingw with a custom command 00:01:49.072 [32/740] Generating lib/rte_ethdev_def with a custom command 00:01:49.072 [33/740] Generating lib/rte_pci_mingw with a custom command 00:01:49.072 [34/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:01:49.072 [35/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:01:49.072 [36/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:01:49.072 [37/740] Generating lib/rte_pci_def with a custom command 00:01:49.072 [38/740] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:01:49.072 [39/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:01:49.072 [40/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:01:49.072 [41/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 
00:01:49.072 [42/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:01:49.072 [43/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:01:49.072 [44/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:01:49.072 [45/740] Linking static target lib/librte_kvargs.a 00:01:49.072 [46/740] Generating lib/rte_cmdline_def with a custom command 00:01:49.072 [47/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:01:49.072 [48/740] Generating lib/rte_cmdline_mingw with a custom command 00:01:49.072 [49/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:01:49.072 [50/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:01:49.072 [51/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:01:49.072 [52/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:01:49.072 [53/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:01:49.072 [54/740] Generating lib/rte_metrics_def with a custom command 00:01:49.072 [55/740] Generating lib/rte_metrics_mingw with a custom command 00:01:49.072 [56/740] Generating lib/rte_hash_mingw with a custom command 00:01:49.072 [57/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:01:49.072 [58/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:01:49.072 [59/740] Generating lib/rte_timer_def with a custom command 00:01:49.072 [60/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:01:49.072 [61/740] Generating lib/rte_hash_def with a custom command 00:01:49.072 [62/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:01:49.072 [63/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:01:49.072 [64/740] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:01:49.072 [65/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:01:49.072 [66/740] Generating lib/rte_timer_mingw with a custom command 00:01:49.072 [67/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:01:49.072 [68/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:01:49.072 [69/740] Generating lib/rte_acl_mingw with a custom command 00:01:49.072 [70/740] Generating lib/rte_bbdev_def with a custom command 00:01:49.072 [71/740] Generating lib/rte_acl_def with a custom command 00:01:49.072 [72/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:01:49.072 [73/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:01:49.072 [74/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:01:49.072 [75/740] Generating lib/rte_bbdev_mingw with a custom command 00:01:49.072 [76/740] Generating lib/rte_bitratestats_mingw with a custom command 00:01:49.072 [77/740] Generating lib/rte_bitratestats_def with a custom command 00:01:49.072 [78/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:01:49.072 [79/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:01:49.072 [80/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:01:49.072 [81/740] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:01:49.072 [82/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:01:49.072 [83/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 
00:01:49.072 [84/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:01:49.072 [85/740] Generating lib/rte_bpf_mingw with a custom command 00:01:49.072 [86/740] Linking static target lib/librte_pci.a 00:01:49.072 [87/740] Generating lib/rte_bpf_def with a custom command 00:01:49.072 [88/740] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:01:49.072 [89/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:01:49.072 [90/740] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:01:49.072 [91/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:01:49.072 [92/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:01:49.072 [93/740] Generating lib/rte_cfgfile_def with a custom command 00:01:49.072 [94/740] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:01:49.072 [95/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:01:49.072 [96/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:01:49.072 [97/740] Generating lib/rte_cfgfile_mingw with a custom command 00:01:49.072 [98/740] Generating lib/rte_compressdev_mingw with a custom command 00:01:49.072 [99/740] Linking static target lib/librte_ring.a 00:01:49.072 [100/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:01:49.072 [101/740] Generating lib/rte_compressdev_def with a custom command 00:01:49.072 [102/740] Linking static target lib/librte_meter.a 00:01:49.072 [103/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_log.c.o 00:01:49.072 [104/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:01:49.072 [105/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:01:49.072 [106/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:01:49.072 [107/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:01:49.072 [108/740] Generating lib/rte_cryptodev_def with a custom command 00:01:49.072 [109/740] Generating lib/rte_cryptodev_mingw with a custom command 00:01:49.072 [110/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:01:49.072 [111/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:01:49.072 [112/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:01:49.072 [113/740] Generating lib/rte_distributor_def with a custom command 00:01:49.072 [114/740] Generating lib/rte_distributor_mingw with a custom command 00:01:49.072 [115/740] Generating lib/rte_efd_mingw with a custom command 00:01:49.072 [116/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:01:49.072 [117/740] Generating lib/rte_efd_def with a custom command 00:01:49.073 [118/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:01:49.073 [119/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:01:49.335 [120/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:01:49.335 [121/740] Generating lib/rte_eventdev_mingw with a custom command 00:01:49.335 [122/740] Generating lib/rte_eventdev_def with a custom command 00:01:49.335 [123/740] Generating lib/rte_gpudev_def with a custom command 00:01:49.335 [124/740] Generating lib/rte_gpudev_mingw with a custom command 00:01:49.335 [125/740] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:01:49.335 [126/740] Compiling C object 
lib/librte_metrics.a.p/metrics_rte_metrics.c.o 00:01:49.335 [127/740] Generating lib/rte_gro_def with a custom command 00:01:49.335 [128/740] Generating lib/rte_gro_mingw with a custom command 00:01:49.335 [129/740] Generating lib/rte_gso_def with a custom command 00:01:49.335 [130/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:01:49.335 [131/740] Generating lib/rte_gso_mingw with a custom command 00:01:49.335 [132/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:01:49.335 [133/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:01:49.335 [134/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:01:49.335 [135/740] Generating lib/rte_ip_frag_def with a custom command 00:01:49.335 [136/740] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.335 [137/740] Generating lib/rte_ip_frag_mingw with a custom command 00:01:49.335 [138/740] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.335 [139/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:01:49.596 [140/740] Generating lib/rte_jobstats_mingw with a custom command 00:01:49.596 [141/740] Linking target lib/librte_kvargs.so.23.0 00:01:49.596 [142/740] Generating lib/rte_jobstats_def with a custom command 00:01:49.596 [143/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:01:49.596 [144/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:01:49.596 [145/740] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.596 [146/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:01:49.596 [147/740] Compiling C object lib/librte_cfgfile.a.p/cfgfile_rte_cfgfile.c.o 00:01:49.596 [148/740] Linking static target lib/librte_cfgfile.a 00:01:49.596 [149/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:01:49.596 [150/740] Generating lib/rte_latencystats_def with a custom command 00:01:49.596 [151/740] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:01:49.596 [152/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:01:49.596 [153/740] Generating lib/rte_latencystats_mingw with a custom command 00:01:49.596 [154/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:01:49.596 [155/740] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:01:49.596 [156/740] Generating lib/rte_lpm_mingw with a custom command 00:01:49.596 [157/740] Generating lib/rte_lpm_def with a custom command 00:01:49.596 [158/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:01:49.596 [159/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:01:49.596 [160/740] Generating lib/rte_member_def with a custom command 00:01:49.596 [161/740] Generating lib/rte_member_mingw with a custom command 00:01:49.596 [162/740] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:01:49.596 [163/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:01:49.596 [164/740] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.596 [165/740] Generating lib/rte_pcapng_def with a custom command 00:01:49.596 [166/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:01:49.596 [167/740] Compiling C object lib/librte_jobstats.a.p/jobstats_rte_jobstats.c.o 00:01:49.596 
[168/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:01:49.596 [169/740] Compiling C object lib/librte_acl.a.p/acl_tb_mem.c.o 00:01:49.596 [170/740] Linking static target lib/librte_jobstats.a 00:01:49.596 [171/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:01:49.596 [172/740] Generating lib/rte_pcapng_mingw with a custom command 00:01:49.596 [173/740] Linking static target lib/librte_cmdline.a 00:01:49.596 [174/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:01:49.596 [175/740] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:01:49.596 [176/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:01:49.596 [177/740] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:01:49.596 [178/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:01:49.596 [179/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:01:49.596 [180/740] Generating lib/rte_power_def with a custom command 00:01:49.596 [181/740] Generating symbol file lib/librte_kvargs.so.23.0.p/librte_kvargs.so.23.0.symbols 00:01:49.596 [182/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:01:49.596 [183/740] Linking static target lib/librte_timer.a 00:01:49.596 [184/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:01:49.596 [185/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:01:49.596 [186/740] Generating lib/rte_power_mingw with a custom command 00:01:49.596 [187/740] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:01:49.596 [188/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:01:49.596 [189/740] Linking static target lib/librte_telemetry.a 00:01:49.596 [190/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf.c.o 00:01:49.596 [191/740] Linking static target lib/net/libnet_crc_avx512_lib.a 00:01:49.596 [192/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_stub.c.o 00:01:49.596 [193/740] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics_telemetry.c.o 00:01:49.596 [194/740] Generating lib/rte_rawdev_def with a custom command 00:01:49.596 [195/740] Generating lib/rte_regexdev_def with a custom command 00:01:49.596 [196/740] Linking static target lib/librte_metrics.a 00:01:49.596 [197/740] Generating lib/rte_rawdev_mingw with a custom command 00:01:49.858 [198/740] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:01:49.859 [199/740] Generating lib/rte_dmadev_def with a custom command 00:01:49.859 [200/740] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:01:49.859 [201/740] Generating lib/rte_regexdev_mingw with a custom command 00:01:49.859 [202/740] Compiling C object lib/librte_acl.a.p/acl_rte_acl.c.o 00:01:49.859 [203/740] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:01:49.859 [204/740] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:01:49.859 [205/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_dump.c.o 00:01:49.859 [206/740] Generating lib/rte_dmadev_mingw with a custom command 00:01:49.859 [207/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:01:49.859 [208/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:01:49.859 [209/740] Generating lib/rte_rib_mingw with a custom command 00:01:49.859 [210/740] Generating lib/rte_rib_def with a custom command 00:01:49.859 
[211/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load.c.o 00:01:49.859 [212/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:01:49.859 [213/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:01:49.859 [214/740] Generating lib/rte_reorder_def with a custom command 00:01:49.859 [215/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_match_sse.c.o 00:01:49.859 [216/740] Generating lib/rte_reorder_mingw with a custom command 00:01:49.859 [217/740] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:01:49.859 [218/740] Generating lib/rte_sched_mingw with a custom command 00:01:49.859 [219/740] Generating lib/rte_sched_def with a custom command 00:01:49.859 [220/740] Generating lib/rte_security_def with a custom command 00:01:49.859 [221/740] Compiling C object lib/librte_bitratestats.a.p/bitratestats_rte_bitrate.c.o 00:01:49.859 [222/740] Generating lib/rte_security_mingw with a custom command 00:01:49.859 [223/740] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:01:49.859 [224/740] Linking static target lib/librte_bitratestats.a 00:01:49.859 [225/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_ring.c.o 00:01:49.859 [226/740] Generating lib/rte_stack_def with a custom command 00:01:49.859 [227/740] Generating lib/rte_stack_mingw with a custom command 00:01:49.859 [228/740] Linking static target lib/librte_net.a 00:01:49.859 [229/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:01:49.859 [230/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:01:49.859 [231/740] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_private.c.o 00:01:49.859 [232/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:01:49.859 [233/740] Generating lib/rte_vhost_def with a custom command 00:01:49.859 [234/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:01:49.859 [235/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:01:49.859 [236/740] Generating lib/rte_vhost_mingw with a custom command 00:01:49.859 [237/740] Compiling C object lib/librte_sched.a.p/sched_rte_pie.c.o 00:01:49.859 [238/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:01:49.859 [239/740] Compiling C object lib/librte_gso.a.p/gso_gso_udp4.c.o 00:01:49.859 [240/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:01:49.859 [241/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack_std.c.o 00:01:49.859 [242/740] Generating lib/rte_ipsec_def with a custom command 00:01:49.859 [243/740] Generating lib/rte_ipsec_mingw with a custom command 00:01:49.859 [244/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load_elf.c.o 00:01:49.859 [245/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack_lf.c.o 00:01:49.859 [246/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:01:49.859 [247/740] Compiling C object lib/librte_power.a.p/power_rte_power_empty_poll.c.o 00:01:49.859 [248/740] Compiling C object lib/librte_gso.a.p/gso_gso_tcp4.c.o 00:01:49.859 [249/740] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_udp4.c.o 00:01:49.859 [250/740] Compiling C object lib/librte_sched.a.p/sched_rte_red.c.o 00:01:49.859 [251/740] Compiling C object lib/librte_sched.a.p/sched_rte_approx.c.o 00:01:49.859 [252/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack.c.o 00:01:49.859 [253/740] Linking static target lib/librte_stack.a 00:01:49.859 [254/740] Generating 
lib/rte_fib_mingw with a custom command 00:01:49.859 [255/740] Generating lib/rte_fib_def with a custom command 00:01:49.859 [256/740] Compiling C object lib/librte_gso.a.p/gso_rte_gso.c.o 00:01:49.859 [257/740] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_tcp4.c.o 00:01:49.859 [258/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_single.c.o 00:01:49.859 [259/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:01:50.124 [260/740] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_trace_points.c.o 00:01:50.124 [261/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_convert.c.o 00:01:50.124 [262/740] Generating lib/rte_port_def with a custom command 00:01:50.124 [263/740] Generating lib/rte_port_mingw with a custom command 00:01:50.124 [264/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:01:50.124 [265/740] Compiling C object lib/librte_acl.a.p/acl_acl_gen.c.o 00:01:50.124 [266/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_reassembly.c.o 00:01:50.124 [267/740] Generating lib/cfgfile.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.124 [268/740] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:01:50.124 [269/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:01:50.124 [270/740] Generating lib/rte_pdump_mingw with a custom command 00:01:50.124 [271/740] Generating lib/rte_pdump_def with a custom command 00:01:50.124 [272/740] Linking static target lib/librte_compressdev.a 00:01:50.124 [273/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_reassembly.c.o 00:01:50.124 [274/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:01:50.124 [275/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_scalar.c.o 00:01:50.124 [276/740] Compiling C object lib/librte_gro.a.p/gro_gro_tcp4.c.o 00:01:50.124 [277/740] Generating lib/jobstats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.124 [278/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_exec.c.o 00:01:50.124 [279/740] Compiling C object lib/librte_fib.a.p/fib_rte_fib.c.o 00:01:50.124 [280/740] Compiling C object lib/librte_gro.a.p/gro_gro_udp4.c.o 00:01:50.124 [281/740] Generating lib/bitratestats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.124 [282/740] Compiling C object lib/librte_gro.a.p/gro_rte_gro.c.o 00:01:50.124 [283/740] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:01:50.124 [284/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:01:50.124 [285/740] Linking static target lib/librte_rcu.a 00:01:50.124 [286/740] Compiling C object lib/librte_rawdev.a.p/rawdev_rte_rawdev.c.o 00:01:50.124 [287/740] Linking static target lib/librte_mempool.a 00:01:50.124 [288/740] Linking static target lib/librte_rawdev.a 00:01:50.124 [289/740] Generating lib/rte_table_def with a custom command 00:01:50.124 [290/740] Compiling C object lib/librte_bbdev.a.p/bbdev_rte_bbdev.c.o 00:01:50.124 [291/740] Generating lib/rte_table_mingw with a custom command 00:01:50.124 [292/740] Linking static target lib/librte_bbdev.a 00:01:50.124 [293/740] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_tcp4.c.o 00:01:50.124 [294/740] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_udp4.c.o 00:01:50.124 [295/740] Compiling C object lib/librte_gpudev.a.p/gpudev_gpudev.c.o 00:01:50.124 [296/740] Compiling C object 
lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:01:50.124 [297/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ip_frag_common.c.o 00:01:50.124 [298/740] Linking static target lib/librte_gpudev.a 00:01:50.124 [299/740] Linking static target lib/librte_gro.a 00:01:50.124 [300/740] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.124 [301/740] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.124 [302/740] Linking static target lib/librte_dmadev.a 00:01:50.124 [303/740] Compiling C object lib/librte_member.a.p/member_rte_member.c.o 00:01:50.124 [304/740] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:01:50.124 [305/740] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.124 [306/740] Generating lib/stack.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.386 [307/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_ip_frag_internal.c.o 00:01:50.386 [308/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_fragmentation.c.o 00:01:50.386 [309/740] Compiling C object lib/librte_table.a.p/table_rte_swx_keycmp.c.o 00:01:50.386 [310/740] Generating lib/rte_pipeline_def with a custom command 00:01:50.386 [311/740] Generating lib/rte_pipeline_mingw with a custom command 00:01:50.386 [312/740] Compiling C object lib/librte_gso.a.p/gso_gso_common.c.o 00:01:50.386 [313/740] Linking target lib/librte_telemetry.so.23.0 00:01:50.386 [314/740] Linking static target lib/librte_gso.a 00:01:50.386 [315/740] Compiling C object lib/librte_latencystats.a.p/latencystats_rte_latencystats.c.o 00:01:50.386 [316/740] Generating lib/metrics.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.386 [317/740] Compiling C object lib/librte_power.a.p/power_rte_power_intel_uncore.c.o 00:01:50.386 [318/740] Linking static target lib/librte_latencystats.a 00:01:50.386 [319/740] Generating lib/rte_graph_def with a custom command 00:01:50.386 [320/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_pkt.c.o 00:01:50.386 [321/740] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:01:50.386 [322/740] Compiling C object lib/member/libsketch_avx512_tmp.a.p/rte_member_sketch_avx512.c.o 00:01:50.386 [323/740] Generating lib/rte_graph_mingw with a custom command 00:01:50.386 [324/740] Linking static target lib/member/libsketch_avx512_tmp.a 00:01:50.386 [325/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor.c.o 00:01:50.386 [326/740] Linking static target lib/librte_distributor.a 00:01:50.386 [327/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_fragmentation.c.o 00:01:50.386 [328/740] Linking static target lib/librte_ip_frag.a 00:01:50.386 [329/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_learner.c.o 00:01:50.386 [330/740] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm.c.o 00:01:50.386 [331/740] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:01:50.386 [332/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ses.c.o 00:01:50.386 [333/740] Compiling C object lib/librte_node.a.p/node_null.c.o 00:01:50.386 [334/740] Compiling C object lib/librte_member.a.p/member_rte_member_vbf.c.o 00:01:50.645 [335/740] Compiling C object lib/librte_regexdev.a.p/regexdev_rte_regexdev.c.o 00:01:50.645 [336/740] Compiling C object lib/librte_fib.a.p/fib_rte_fib6.c.o 00:01:50.645 [337/740] Linking static target 
lib/librte_regexdev.a 00:01:50.645 [338/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_validate.c.o 00:01:50.645 [339/740] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:01:50.645 [340/740] Generating symbol file lib/librte_telemetry.so.23.0.p/librte_telemetry.so.23.0.symbols 00:01:50.645 [341/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_em.c.o 00:01:50.645 [342/740] Generating lib/rte_node_def with a custom command 00:01:50.645 [343/740] Generating lib/rte_node_mingw with a custom command 00:01:50.645 [344/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:01:50.645 [345/740] Generating lib/gro.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.645 [346/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_telemetry.c.o 00:01:50.645 [347/740] Generating drivers/rte_bus_pci_def with a custom command 00:01:50.645 [348/740] Generating lib/gso.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.645 [349/740] Compiling C object lib/librte_rib.a.p/rib_rte_rib.c.o 00:01:50.645 [350/740] Linking static target lib/librte_eal.a 00:01:50.645 [351/740] Generating drivers/rte_bus_pci_mingw with a custom command 00:01:50.645 [352/740] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:01:50.645 [353/740] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:01:50.645 [354/740] Compiling C object lib/librte_port.a.p/port_rte_port_sched.c.o 00:01:50.645 [355/740] Generating drivers/rte_bus_vdev_def with a custom command 00:01:50.645 [356/740] Linking static target lib/librte_power.a 00:01:50.645 [357/740] Generating drivers/rte_bus_vdev_mingw with a custom command 00:01:50.645 [358/740] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.645 [359/740] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:01:50.645 [360/740] Generating lib/latencystats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.645 [361/740] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:01:50.645 [362/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:01:50.645 [363/740] Generating drivers/rte_mempool_ring_def with a custom command 00:01:50.645 [364/740] Linking static target lib/librte_reorder.a 00:01:50.645 [365/740] Generating drivers/rte_mempool_ring_mingw with a custom command 00:01:50.645 [366/740] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:01:50.645 [367/740] Compiling C object lib/librte_fib.a.p/fib_dir24_8_avx512.c.o 00:01:50.645 [368/740] Compiling C object lib/librte_fib.a.p/fib_trie_avx512.c.o 00:01:50.646 [369/740] Compiling C object lib/librte_table.a.p/table_rte_table_array.c.o 00:01:50.646 [370/740] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:01:50.646 [371/740] Linking static target lib/librte_security.a 00:01:50.646 [372/740] Compiling C object lib/librte_pcapng.a.p/pcapng_rte_pcapng.c.o 00:01:50.646 [373/740] Linking static target lib/librte_pcapng.a 00:01:50.646 [374/740] Compiling C object lib/librte_table.a.p/table_rte_table_stub.c.o 00:01:50.905 [375/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:01:50.905 [376/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_timer_adapter.c.o 00:01:50.905 [377/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_wm.c.o 00:01:50.905 [378/740] Generating 
lib/distributor.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.905 [379/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_cuckoo.c.o 00:01:50.905 [380/740] Compiling C object lib/librte_table.a.p/table_rte_table_lpm_ipv6.c.o 00:01:50.905 [381/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_jit_x86.c.o 00:01:50.905 [382/740] Compiling C object lib/librte_graph.a.p/graph_graph_debug.c.o 00:01:50.905 [383/740] Compiling C object lib/librte_ipsec.a.p/ipsec_sa.c.o 00:01:50.905 [384/740] Generating lib/rawdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.905 [385/740] Compiling C object lib/librte_table.a.p/table_rte_table_lpm.c.o 00:01:50.905 [386/740] Linking static target lib/librte_bpf.a 00:01:50.905 [387/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_selector.c.o 00:01:50.905 [388/740] Generating drivers/rte_net_i40e_def with a custom command 00:01:50.905 [389/740] Generating lib/ip_frag.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.905 [390/740] Generating drivers/rte_net_i40e_mingw with a custom command 00:01:50.905 [391/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:01:50.905 [392/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:01:50.905 [393/740] Linking static target lib/librte_mbuf.a 00:01:50.905 [394/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_port_in_action.c.o 00:01:50.905 [395/740] Compiling C object lib/librte_graph.a.p/graph_graph_ops.c.o 00:01:50.905 [396/740] Compiling C object lib/librte_graph.a.p/graph_graph_populate.c.o 00:01:50.905 [397/740] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:01:50.905 [398/740] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:01:50.905 [399/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_crypto_adapter.c.o 00:01:50.905 [400/740] Compiling C object lib/librte_table.a.p/table_rte_table_acl.c.o 00:01:50.905 [401/740] Compiling C object lib/librte_graph.a.p/graph_node.c.o 00:01:50.905 [402/740] Linking static target drivers/libtmp_rte_bus_vdev.a 00:01:50.905 [403/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_eventdev.c.o 00:01:50.905 [404/740] Compiling C object lib/librte_node.a.p/node_log.c.o 00:01:50.905 [405/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_tx_adapter.c.o 00:01:50.905 [406/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:01:50.905 [407/740] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm6.c.o 00:01:50.905 [408/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_fd.c.o 00:01:50.905 [409/740] Compiling C object lib/librte_node.a.p/node_ethdev_ctrl.c.o 00:01:50.905 [410/740] Compiling C object lib/librte_rib.a.p/rib_rte_rib6.c.o 00:01:50.905 [411/740] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_cmdline_test.c.o 00:01:51.170 [412/740] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_commands.c.o 00:01:51.170 [413/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ethdev.c.o 00:01:51.170 [414/740] Linking static target lib/librte_lpm.a 00:01:51.170 [415/740] Linking static target lib/librte_rib.a 00:01:51.170 [416/740] Compiling C object lib/librte_port.a.p/port_rte_port_ras.c.o 00:01:51.170 [417/740] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.170 [418/740] Compiling C object lib/librte_port.a.p/port_rte_port_frag.c.o 
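(Editorial note: the paired "Linking static target lib/librte_*.a" entries, the later "Linking target lib/librte_*.so.23.0" entries, and the "Generating lib/*.sym_chk ..." / "Generating symbol file ..." entries reflect meson building every DPDK library in both static and shared form and checking the exported symbols against the version map. A minimal way to inspect the result locally; the ls/nm invocation is illustrative and not part of this log.)

  # Both variants of a library land under build-tmp/lib; nm -D --defined-only
  # lists the dynamic symbols the shared object actually exports.
  cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
  ls build-tmp/lib/librte_kvargs.a build-tmp/lib/librte_kvargs.so.23.0
  nm -D --defined-only build-tmp/lib/librte_kvargs.so.23.0 | head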
00:01:51.170 [419/740] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.170 [420/740] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.170 [421/740] Compiling C object lib/librte_graph.a.p/graph_graph.c.o 00:01:51.170 [422/740] Generating lib/bbdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.170 [423/740] Compiling C object lib/librte_port.a.p/port_rte_port_fd.c.o 00:01:51.170 [424/740] Compiling C object lib/librte_port.a.p/port_rte_port_ethdev.c.o 00:01:51.170 [425/740] Compiling C object lib/librte_node.a.p/node_pkt_drop.c.o 00:01:51.170 [426/740] Compiling C object lib/librte_member.a.p/member_rte_member_ht.c.o 00:01:51.170 [427/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:01:51.170 [428/740] Compiling C object lib/librte_graph.a.p/graph_graph_stats.c.o 00:01:51.170 [429/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_sad.c.o 00:01:51.170 [430/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_main.c.o 00:01:51.170 [431/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:01:51.170 [432/740] Linking static target lib/librte_graph.a 00:01:51.170 [433/740] Compiling C object lib/librte_port.a.p/port_rte_port_sym_crypto.c.o 00:01:51.170 [434/740] Compiling C object lib/librte_node.a.p/node_ethdev_tx.c.o 00:01:51.170 [435/740] Compiling C object lib/librte_efd.a.p/efd_rte_efd.c.o 00:01:51.170 [436/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_source_sink.c.o 00:01:51.170 [437/740] Linking static target lib/librte_efd.a 00:01:51.170 [438/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_test.c.o 00:01:51.170 [439/740] Generating lib/pcapng.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.170 [440/740] Compiling C object lib/librte_fib.a.p/fib_trie.c.o 00:01:51.170 [441/740] Compiling C object lib/librte_node.a.p/node_ethdev_rx.c.o 00:01:51.170 [442/740] Compiling C object lib/librte_port.a.p/port_rte_port_eventdev.c.o 00:01:51.170 [443/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_diag.c.o 00:01:51.432 [444/740] Compiling C object lib/librte_port.a.p/port_rte_port_source_sink.c.o 00:01:51.432 [445/740] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:01:51.432 [446/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:01:51.432 [447/740] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:51.432 [448/740] Generating lib/bpf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.432 [449/740] Linking static target drivers/libtmp_rte_bus_pci.a 00:01:51.432 [450/740] Linking static target drivers/librte_bus_vdev.a 00:01:51.432 [451/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key8.c.o 00:01:51.432 [452/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_hmc.c.o 00:01:51.432 [453/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ring.c.o 00:01:51.432 [454/740] Compiling C object drivers/librte_bus_vdev.so.23.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:51.432 [455/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_parser.c.o 00:01:51.432 [456/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_pipeline.c.o 00:01:51.432 [457/740] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.432 [458/740] 
Compiling C object lib/librte_fib.a.p/fib_dir24_8.c.o 00:01:51.432 [459/740] Linking static target lib/librte_fib.a 00:01:51.432 [460/740] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.432 [461/740] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.712 [462/740] Compiling C object lib/librte_acl.a.p/acl_acl_bld.c.o 00:01:51.712 [463/740] Generating lib/efd.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.712 [464/740] Generating lib/gpudev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.712 [465/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_ext.c.o 00:01:51.712 [466/740] Compiling C object lib/librte_node.a.p/node_pkt_cls.c.o 00:01:51.712 [467/740] Compiling C object lib/librte_pdump.a.p/pdump_rte_pdump.c.o 00:01:51.712 [468/740] Generating lib/lpm.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.712 [469/740] Linking static target lib/librte_pdump.a 00:01:51.713 [470/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key16.c.o 00:01:51.713 [471/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:01:51.713 [472/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:01:51.713 [473/740] Generating lib/regexdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.713 [474/740] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:01:51.713 [475/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_dcb.c.o 00:01:51.713 [476/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_vf_representor.c.o 00:01:51.713 [477/740] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:51.713 [478/740] Linking static target drivers/librte_bus_pci.a 00:01:51.713 [479/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_lan_hmc.c.o 00:01:51.713 [480/740] Compiling C object drivers/librte_bus_pci.so.23.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:51.713 [481/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_options_parse.c.o 00:01:51.713 [482/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_common.c.o 00:01:51.713 [483/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vectors.c.o 00:01:51.713 [484/740] Generating lib/rib.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.713 [485/740] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.713 [486/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_lru.c.o 00:01:52.003 [487/740] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.003 [488/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_tm.c.o 00:01:52.003 [489/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_main.c.o 00:01:52.003 [490/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_main.c.o 00:01:52.003 [491/740] Compiling C object app/dpdk-test-acl.p/test-acl_main.c.o 00:01:52.003 [492/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_adminq.c.o 00:01:52.003 [493/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_flow_gen.c.o 00:01:52.003 [494/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key32.c.o 
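(Editorial note: entries such as "Generating drivers/rte_bus_pci.pmd.c with a custom command" followed by compiling meson-generated_.._rte_bus_pci.pmd.c.o into both the .a and .so driver variants are meson emitting a small per-driver stub that embeds the PMD's registration metadata. One hedged way to read that metadata back out of a built driver, assuming the stock usertools script is available in this tree:)

  # dpdk-pmdinfo.py ships in DPDK's usertools/ and dumps the PMD metadata
  # embedded by the generated pmd.c stub; this invocation is illustrative.
  cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
  python3 usertools/dpdk-pmdinfo.py build-tmp/drivers/librte_net_i40e.so.23.0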
00:01:52.003 [495/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_items_gen.c.o 00:01:52.003 [496/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_options.c.o 00:01:52.003 [497/740] Linking static target lib/librte_table.a 00:01:52.003 [498/740] Generating lib/fib.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.003 [499/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vector_parsing.c.o 00:01:52.003 [500/740] Compiling C object lib/librte_node.a.p/node_ip4_lookup.c.o 00:01:52.003 [501/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_config.c.o 00:01:52.003 [502/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_main.c.o 00:01:52.003 [503/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_init.c.o 00:01:52.003 [504/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ctl.c.o 00:01:52.003 [505/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_common.c.o 00:01:52.003 [506/740] Compiling C object app/dpdk-dumpcap.p/dumpcap_main.c.o 00:01:52.003 [507/740] Compiling C object app/dpdk-test-gpudev.p/test-gpudev_main.c.o 00:01:52.003 [508/740] Generating lib/pdump.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.003 [509/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_stub.c.o 00:01:52.003 [510/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_hash.c.o 00:01:52.003 [511/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_acl.c.o 00:01:52.261 [512/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_main.c.o 00:01:52.261 [513/740] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.261 [514/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_throughput.c.o 00:01:52.261 [515/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm.c.o 00:01:52.261 [516/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm_ipv6.c.o 00:01:52.261 [517/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:01:52.261 [518/740] Linking static target lib/librte_cryptodev.a 00:01:52.261 [519/740] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:01:52.261 [520/740] Linking static target drivers/libtmp_rte_mempool_ring.a 00:01:52.261 [521/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_ops.c.o 00:01:52.261 [522/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_atq.c.o 00:01:52.261 [523/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_options_parsing.c.o 00:01:52.261 [524/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_sse.c.o 00:01:52.261 [525/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_throughput.c.o 00:01:52.261 [526/740] Compiling C object lib/librte_sched.a.p/sched_rte_sched.c.o 00:01:52.261 [527/740] Linking static target lib/librte_sched.a 00:01:52.261 [528/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_verify.c.o 00:01:52.261 [529/740] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.261 [530/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_pf.c.o 00:01:52.261 [531/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmd_flex_item.c.o 
00:01:52.261 [532/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_actions_gen.c.o 00:01:52.261 [533/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_nvm.c.o 00:01:52.261 [534/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_queue.c.o 00:01:52.261 [535/740] Compiling C object app/dpdk-proc-info.p/proc-info_main.c.o 00:01:52.261 [536/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_hash.c.o 00:01:52.261 [537/740] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_outb.c.o 00:01:52.261 [538/740] Compiling C object lib/librte_node.a.p/node_ip4_rewrite.c.o 00:01:52.519 [539/740] Linking static target lib/librte_node.a 00:01:52.519 [540/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev.c.o 00:01:52.519 [541/740] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:01:52.519 [542/740] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_inb.c.o 00:01:52.519 [543/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_pmd_cyclecount.c.o 00:01:52.519 [544/740] Linking static target lib/librte_ipsec.a 00:01:52.519 [545/740] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:52.519 [546/740] Compiling C object drivers/librte_mempool_ring.so.23.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:52.519 [547/740] Compiling C object app/dpdk-pdump.p/pdump_main.c.o 00:01:52.519 [548/740] Linking static target drivers/librte_mempool_ring.a 00:01:52.519 [549/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_verify.c.o 00:01:52.519 [550/740] Compiling C object app/dpdk-testpmd.p/test-pmd_5tswap.c.o 00:01:52.519 [551/740] Compiling C object app/dpdk-testpmd.p/test-pmd_ieee1588fwd.c.o 00:01:52.519 [552/740] Generating lib/graph.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.519 [553/740] Compiling C object app/dpdk-testpmd.p/test-pmd_iofwd.c.o 00:01:52.519 [554/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_latency.c.o 00:01:52.519 [555/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_queue.c.o 00:01:52.519 [556/740] Compiling C object app/dpdk-testpmd.p/test-pmd_bpf_cmd.c.o 00:01:52.519 [557/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_common.c.o 00:01:52.519 [558/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_mtr.c.o 00:01:52.519 [559/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_runtime.c.o 00:01:52.519 [560/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_atq.c.o 00:01:52.519 [561/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:01:52.519 [562/740] Linking static target lib/librte_ethdev.a 00:01:52.519 [563/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_vector.c.o 00:01:52.519 [564/740] Compiling C object app/dpdk-testpmd.p/test-pmd_shared_rxq_fwd.c.o 00:01:52.519 [565/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_queue.c.o 00:01:52.519 [566/740] Compiling C object lib/librte_member.a.p/member_rte_member_sketch.c.o 00:01:52.519 [567/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_common.c.o 00:01:52.519 [568/740] Linking static target lib/librte_member.a 00:01:52.519 [569/740] Compiling C object app/dpdk-test-security-perf.p/test-security-perf_test_security_perf.c.o 00:01:52.519 [570/740] Compiling 
C object lib/librte_port.a.p/port_rte_port_ring.c.o 00:01:52.777 [571/740] Generating lib/node.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.777 [572/740] Compiling C object app/dpdk-testpmd.p/test-pmd_macswap.c.o 00:01:52.777 [573/740] Compiling C object app/dpdk-test-sad.p/test-sad_main.c.o 00:01:52.777 [574/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_flow.c.o 00:01:52.777 [575/740] Linking static target lib/librte_port.a 00:01:52.777 [576/740] Compiling C object app/dpdk-testpmd.p/test-pmd_util.c.o 00:01:52.777 [577/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_rx_adapter.c.o 00:01:52.777 [578/740] Compiling C object app/dpdk-testpmd.p/test-pmd_flowgen.c.o 00:01:52.777 [579/740] Linking static target lib/librte_eventdev.a 00:01:52.777 [580/740] Compiling C object app/dpdk-testpmd.p/test-pmd_macfwd.c.o 00:01:52.777 [581/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_atq.c.o 00:01:52.777 [582/740] Compiling C object app/dpdk-testpmd.p/test-pmd_rxonly.c.o 00:01:52.777 [583/740] Generating lib/sched.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.777 [584/740] Compiling C object app/dpdk-testpmd.p/test-pmd_icmpecho.c.o 00:01:52.777 [585/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_fdir.c.o 00:01:52.777 [586/740] Compiling C object app/dpdk-test-fib.p/test-fib_main.c.o 00:01:52.777 [587/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_tm.c.o 00:01:52.777 [588/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_cyclecount.c.o 00:01:52.777 [589/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_rte_pmd_i40e.c.o 00:01:52.777 [590/740] Compiling C object app/dpdk-testpmd.p/test-pmd_parameters.c.o 00:01:52.777 [591/740] Generating lib/ipsec.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.777 [592/740] Generating lib/table.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.034 [593/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_main.c.o 00:01:53.034 [594/740] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:01:53.034 [595/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx2.c.o 00:01:53.034 [596/740] Compiling C object app/dpdk-test-regex.p/test-regex_main.c.o 00:01:53.034 [597/740] Linking static target lib/librte_hash.a 00:01:53.034 [598/740] Compiling C object app/dpdk-testpmd.p/.._drivers_net_i40e_i40e_testpmd.c.o 00:01:53.034 [599/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline_spec.c.o 00:01:53.034 [600/740] Compiling C object drivers/net/i40e/libi40e_avx512_lib.a.p/i40e_rxtx_vec_avx512.c.o 00:01:53.034 [601/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_sse.c.o 00:01:53.034 [602/740] Linking static target drivers/net/i40e/libi40e_avx512_lib.a 00:01:53.034 [603/740] Generating lib/member.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.292 [604/740] Compiling C object app/dpdk-test-security-perf.p/test_test_cryptodev_security_ipsec.c.o 00:01:53.292 [605/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_common.c.o 00:01:53.292 [606/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_avx2.c.o 00:01:53.292 [607/740] Linking static target drivers/net/i40e/base/libi40e_base.a 00:01:53.292 [608/740] Compiling C object app/dpdk-testpmd.p/test-pmd_txonly.c.o 
00:01:53.292 [609/740] Compiling C object app/dpdk-testpmd.p/test-pmd_csumonly.c.o 00:01:53.292 [610/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx512.c.o 00:01:53.549 [611/740] Linking static target lib/librte_acl.a 00:01:53.549 [612/740] Generating lib/port.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.549 [613/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_common.c.o 00:01:53.549 [614/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline.c.o 00:01:53.807 [615/740] Compiling C object app/dpdk-testpmd.p/test-pmd_testpmd.c.o 00:01:53.807 [616/740] Compiling C object app/dpdk-testpmd.p/test-pmd_noisy_vnf.c.o 00:01:53.807 [617/740] Generating lib/acl.sym_chk with a custom command (wrapped by meson to capture output) 00:01:54.065 [618/740] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:01:54.065 [619/740] Compiling C object app/dpdk-testpmd.p/test-pmd_config.c.o 00:01:54.322 [620/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx.c.o 00:01:54.322 [621/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_flow.c.o 00:01:55.254 [622/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_ethdev.c.o 00:01:55.254 [623/740] Linking static target drivers/libtmp_rte_net_i40e.a 00:01:55.254 [624/740] Generating drivers/rte_net_i40e.pmd.c with a custom command 00:01:55.254 [625/740] Compiling C object drivers/librte_net_i40e.a.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:01:55.254 [626/740] Compiling C object drivers/librte_net_i40e.so.23.0.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:01:55.512 [627/740] Linking static target drivers/librte_net_i40e.a 00:01:55.770 [628/740] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.770 [629/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:01:56.029 [630/740] Generating lib/eventdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.288 [631/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline.c.o 00:01:56.288 [632/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_perf.c.o 00:01:56.288 [633/740] Generating drivers/rte_net_i40e.sym_chk with a custom command (wrapped by meson to capture output) 00:02:01.596 [634/740] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:01.855 [635/740] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:02.114 [636/740] Linking static target lib/librte_vhost.a 00:02:02.681 [637/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_table_action.c.o 00:02:02.681 [638/740] Linking static target lib/librte_pipeline.a 00:02:02.941 [639/740] Linking target app/dpdk-dumpcap 00:02:02.941 [640/740] Linking target app/dpdk-proc-info 00:02:02.941 [641/740] Linking target app/dpdk-test-cmdline 00:02:02.941 [642/740] Linking target app/dpdk-test-regex 00:02:02.941 [643/740] Linking target app/dpdk-test-gpudev 00:02:02.941 [644/740] Linking target app/dpdk-test-fib 00:02:02.941 [645/740] Linking target app/dpdk-test-acl 00:02:02.941 [646/740] Linking target app/dpdk-pdump 00:02:02.941 [647/740] Linking target app/dpdk-test-flow-perf 00:02:02.941 [648/740] Linking target app/dpdk-test-crypto-perf 00:02:02.941 [649/740] Linking target app/dpdk-test-bbdev 00:02:02.941 [650/740] Linking target app/dpdk-test-eventdev 00:02:02.941 [651/740] Linking target app/dpdk-test-sad 00:02:02.941 
[652/740] Linking target app/dpdk-test-compress-perf 00:02:02.941 [653/740] Linking target app/dpdk-test-pipeline 00:02:02.941 [654/740] Linking target app/dpdk-test-security-perf 00:02:02.941 [655/740] Linking target app/dpdk-testpmd 00:02:03.879 [656/740] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:04.139 [657/740] Linking target lib/librte_eal.so.23.0 00:02:04.139 [658/740] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:04.139 [659/740] Generating symbol file lib/librte_eal.so.23.0.p/librte_eal.so.23.0.symbols 00:02:04.397 [660/740] Linking target lib/librte_ring.so.23.0 00:02:04.397 [661/740] Linking target lib/librte_jobstats.so.23.0 00:02:04.397 [662/740] Linking target lib/librte_pci.so.23.0 00:02:04.397 [663/740] Linking target lib/librte_meter.so.23.0 00:02:04.397 [664/740] Linking target lib/librte_cfgfile.so.23.0 00:02:04.397 [665/740] Linking target lib/librte_dmadev.so.23.0 00:02:04.397 [666/740] Linking target lib/librte_timer.so.23.0 00:02:04.397 [667/740] Linking target lib/librte_rawdev.so.23.0 00:02:04.397 [668/740] Linking target lib/librte_stack.so.23.0 00:02:04.397 [669/740] Linking target drivers/librte_bus_vdev.so.23.0 00:02:04.397 [670/740] Linking target lib/librte_graph.so.23.0 00:02:04.397 [671/740] Linking target lib/librte_acl.so.23.0 00:02:04.397 [672/740] Generating symbol file lib/librte_dmadev.so.23.0.p/librte_dmadev.so.23.0.symbols 00:02:04.397 [673/740] Generating symbol file lib/librte_meter.so.23.0.p/librte_meter.so.23.0.symbols 00:02:04.397 [674/740] Generating symbol file lib/librte_pci.so.23.0.p/librte_pci.so.23.0.symbols 00:02:04.397 [675/740] Generating symbol file lib/librte_ring.so.23.0.p/librte_ring.so.23.0.symbols 00:02:04.397 [676/740] Generating symbol file lib/librte_timer.so.23.0.p/librte_timer.so.23.0.symbols 00:02:04.397 [677/740] Generating symbol file lib/librte_acl.so.23.0.p/librte_acl.so.23.0.symbols 00:02:04.397 [678/740] Generating symbol file drivers/librte_bus_vdev.so.23.0.p/librte_bus_vdev.so.23.0.symbols 00:02:04.397 [679/740] Generating symbol file lib/librte_graph.so.23.0.p/librte_graph.so.23.0.symbols 00:02:04.397 [680/740] Linking target lib/librte_mempool.so.23.0 00:02:04.397 [681/740] Linking target lib/librte_rcu.so.23.0 00:02:04.397 [682/740] Linking target drivers/librte_bus_pci.so.23.0 00:02:04.655 [683/740] Generating symbol file lib/librte_rcu.so.23.0.p/librte_rcu.so.23.0.symbols 00:02:04.655 [684/740] Generating symbol file lib/librte_mempool.so.23.0.p/librte_mempool.so.23.0.symbols 00:02:04.655 [685/740] Generating symbol file drivers/librte_bus_pci.so.23.0.p/librte_bus_pci.so.23.0.symbols 00:02:04.655 [686/740] Linking target drivers/librte_mempool_ring.so.23.0 00:02:04.655 [687/740] Linking target lib/librte_rib.so.23.0 00:02:04.655 [688/740] Linking target lib/librte_mbuf.so.23.0 00:02:04.914 [689/740] Generating symbol file lib/librte_rib.so.23.0.p/librte_rib.so.23.0.symbols 00:02:04.914 [690/740] Generating symbol file lib/librte_mbuf.so.23.0.p/librte_mbuf.so.23.0.symbols 00:02:04.914 [691/740] Linking target lib/librte_compressdev.so.23.0 00:02:04.914 [692/740] Linking target lib/librte_reorder.so.23.0 00:02:04.914 [693/740] Linking target lib/librte_fib.so.23.0 00:02:04.914 [694/740] Linking target lib/librte_regexdev.so.23.0 00:02:04.914 [695/740] Linking target lib/librte_bbdev.so.23.0 00:02:04.914 [696/740] Linking target lib/librte_net.so.23.0 00:02:04.914 [697/740] Linking target lib/librte_gpudev.so.23.0 
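(Editorial note: the "Linking target app/dpdk-*" entries above are the final phase of the build, where the test and example applications are linked against the shared libraries, with "Generating symbol file" entries recording each .so's exports. A quick sanity check of the produced binaries; the ls itself is illustrative, only the paths come from the log.)

  # The linked application binaries sit under build-tmp/app after this phase.
  ls /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/app/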
00:02:04.914 [698/740] Linking target lib/librte_distributor.so.23.0 00:02:04.914 [699/740] Linking target lib/librte_cryptodev.so.23.0 00:02:04.914 [700/740] Linking target lib/librte_sched.so.23.0 00:02:04.914 [701/740] Generating symbol file lib/librte_net.so.23.0.p/librte_net.so.23.0.symbols 00:02:04.914 [702/740] Generating symbol file lib/librte_sched.so.23.0.p/librte_sched.so.23.0.symbols 00:02:04.914 [703/740] Generating symbol file lib/librte_cryptodev.so.23.0.p/librte_cryptodev.so.23.0.symbols 00:02:05.172 [704/740] Linking target lib/librte_cmdline.so.23.0 00:02:05.172 [705/740] Linking target lib/librte_hash.so.23.0 00:02:05.172 [706/740] Linking target lib/librte_security.so.23.0 00:02:05.172 [707/740] Linking target lib/librte_ethdev.so.23.0 00:02:05.172 [708/740] Generating symbol file lib/librte_hash.so.23.0.p/librte_hash.so.23.0.symbols 00:02:05.172 [709/740] Generating symbol file lib/librte_security.so.23.0.p/librte_security.so.23.0.symbols 00:02:05.172 [710/740] Generating symbol file lib/librte_ethdev.so.23.0.p/librte_ethdev.so.23.0.symbols 00:02:05.172 [711/740] Linking target lib/librte_efd.so.23.0 00:02:05.172 [712/740] Linking target lib/librte_ipsec.so.23.0 00:02:05.172 [713/740] Linking target lib/librte_lpm.so.23.0 00:02:05.172 [714/740] Linking target lib/librte_member.so.23.0 00:02:05.172 [715/740] Linking target lib/librte_gso.so.23.0 00:02:05.172 [716/740] Linking target lib/librte_metrics.so.23.0 00:02:05.172 [717/740] Linking target lib/librte_gro.so.23.0 00:02:05.172 [718/740] Linking target lib/librte_pcapng.so.23.0 00:02:05.172 [719/740] Linking target lib/librte_bpf.so.23.0 00:02:05.172 [720/740] Linking target lib/librte_ip_frag.so.23.0 00:02:05.172 [721/740] Linking target lib/librte_power.so.23.0 00:02:05.172 [722/740] Linking target lib/librte_eventdev.so.23.0 00:02:05.172 [723/740] Linking target lib/librte_vhost.so.23.0 00:02:05.431 [724/740] Linking target drivers/librte_net_i40e.so.23.0 00:02:05.431 [725/740] Generating symbol file lib/librte_lpm.so.23.0.p/librte_lpm.so.23.0.symbols 00:02:05.431 [726/740] Generating symbol file lib/librte_eventdev.so.23.0.p/librte_eventdev.so.23.0.symbols 00:02:05.431 [727/740] Generating symbol file lib/librte_ip_frag.so.23.0.p/librte_ip_frag.so.23.0.symbols 00:02:05.431 [728/740] Generating symbol file lib/librte_metrics.so.23.0.p/librte_metrics.so.23.0.symbols 00:02:05.431 [729/740] Generating symbol file lib/librte_bpf.so.23.0.p/librte_bpf.so.23.0.symbols 00:02:05.431 [730/740] Generating symbol file lib/librte_pcapng.so.23.0.p/librte_pcapng.so.23.0.symbols 00:02:05.431 [731/740] Linking target lib/librte_node.so.23.0 00:02:05.431 [732/740] Linking target lib/librte_latencystats.so.23.0 00:02:05.431 [733/740] Linking target lib/librte_bitratestats.so.23.0 00:02:05.431 [734/740] Linking target lib/librte_port.so.23.0 00:02:05.431 [735/740] Linking target lib/librte_pdump.so.23.0 00:02:05.690 [736/740] Generating symbol file lib/librte_port.so.23.0.p/librte_port.so.23.0.symbols 00:02:05.690 [737/740] Linking target lib/librte_table.so.23.0 00:02:05.690 [738/740] Generating symbol file lib/librte_table.so.23.0.p/librte_table.so.23.0.symbols 00:02:07.594 [739/740] Generating lib/pipeline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:07.594 [740/740] Linking target lib/librte_pipeline.so.23.0 00:02:07.594 16:10:36 -- common/autobuild_common.sh@187 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 install 00:02:07.594 ninja: Entering directory 
`/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:02:07.857 [0/1] Installing files. 00:02:07.857 Installing subdir /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_route.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:07.857 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_fib.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_sequential.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:07.857 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/flow_blocks.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/flow_classify.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/ipv4_rules_file.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:07.857 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:07.857 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/pkt_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/neon/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/neon 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/altivec/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/altivec 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/sse/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/sse 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/ptpclient.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_x86.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:07.858 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_nop.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:07.858 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/app_thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_ov.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cmdline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/stats.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_red.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_pie.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:07.858 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/l2fwd-cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/vdpa_blk_compact.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/virtio_net.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk_spec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:07.858 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk_compat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ecdsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_aes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_sha.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:07.858 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_tdes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_rsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_gcm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_cmac.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_xts.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_hmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ccm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/dmafwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:07.859 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/client.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/shared 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/main.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/basicfwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_tx.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep1.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:07.859 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp4.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_process.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/rt.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp6.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep0.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:07.860 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/load_env.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_null_header_reconstruct.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/run_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs_secgw.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/data_rxtx.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:07.860 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/bypass_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_ipv6opts.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/linux_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd 00:02:07.860 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/node/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/node 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/node/node.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/node 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/shared 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:07.860 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/kni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:07.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/kni.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route_ecmp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/kni.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/firewall.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/tap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow_crypto.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t1.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t3.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/README to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/dummy.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t2.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ethdev.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:07.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_routing_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_group_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/packet.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/pcap.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
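The .spec files staged above are SWX pipeline programs and the paired .cli files are console scripts that build and start them. A hedged sketch of driving the staged l2fwd pair through the pipeline example app (the binary location and the -s script option follow the DPDK sample-application docs, not this log, so treat both as assumptions):
cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
./build/pipeline -l 0-1 -- -s examples/l2fwd.cli  # EAL options before '--', app options after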
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/ntb_fwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb
00:02:07.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto
00:02:08.124 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto
00:02:08.124 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer
00:02:08.124 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer
00:02:08.124 Installing lib/librte_kvargs.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_kvargs.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_telemetry.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_telemetry.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_eal.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_eal.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_rcu.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_rcu.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_mempool.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_mempool.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_mbuf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_mbuf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_net.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_net.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_meter.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_meter.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_ethdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_ethdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_cmdline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_cmdline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_metrics.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_metrics.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_hash.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_hash.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_timer.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_timer.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_acl.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_acl.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_bbdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_bbdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_bitratestats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_bitratestats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_bpf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_bpf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_cfgfile.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_cfgfile.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_compressdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_compressdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_cryptodev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_cryptodev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_distributor.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_distributor.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_efd.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_efd.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_eventdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_eventdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_gpudev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_gpudev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_gro.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_gro.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_gso.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_gso.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_ip_frag.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_ip_frag.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_jobstats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_jobstats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_latencystats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_latencystats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_lpm.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_lpm.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_member.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_member.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_pcapng.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_pcapng.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_power.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_power.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_rawdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_rawdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_regexdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_regexdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_dmadev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_dmadev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.124 Installing lib/librte_rib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.125 Installing lib/librte_rib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.125 Installing lib/librte_reorder.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.125 Installing lib/librte_reorder.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.125 Installing lib/librte_sched.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.125 Installing lib/librte_sched.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.125 Installing lib/librte_security.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.125 Installing lib/librte_security.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.125 Installing lib/librte_stack.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.125 Installing lib/librte_stack.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.125 Installing lib/librte_vhost.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.125 Installing lib/librte_vhost.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.125 Installing lib/librte_ipsec.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.125 Installing lib/librte_ipsec.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.125 Installing lib/librte_fib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.125 Installing lib/librte_fib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.125 Installing lib/librte_port.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.125 Installing lib/librte_port.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.125 Installing lib/librte_pdump.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.125 Installing lib/librte_pdump.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.125 Installing lib/librte_table.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.125 Installing lib/librte_table.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.125 Installing lib/librte_pipeline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.125 Installing lib/librte_pipeline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.125 Installing lib/librte_graph.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.125 Installing lib/librte_graph.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.125 Installing lib/librte_node.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.125 Installing lib/librte_node.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.125 Installing drivers/librte_bus_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.125 Installing drivers/librte_bus_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0
00:02:08.125 Installing drivers/librte_bus_vdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.125 Installing drivers/librte_bus_vdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0
00:02:08.125 Installing drivers/librte_mempool_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.125 Installing drivers/librte_mempool_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0
00:02:08.125 Installing drivers/librte_net_i40e.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:08.125 Installing drivers/librte_net_i40e.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0
00:02:08.125 Installing app/dpdk-dumpcap to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:08.125 Installing app/dpdk-pdump to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:08.125 Installing app/dpdk-proc-info to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:08.125 Installing app/dpdk-test-acl to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:08.125 Installing app/dpdk-test-bbdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:08.125 Installing app/dpdk-test-cmdline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:08.125 Installing app/dpdk-test-compress-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:08.125 Installing app/dpdk-test-crypto-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:08.125 Installing app/dpdk-test-eventdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:08.125 Installing app/dpdk-test-fib to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:08.125 Installing app/dpdk-test-flow-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:08.125 Installing app/dpdk-test-gpudev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:08.125 Installing app/dpdk-test-pipeline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:08.125 Installing app/dpdk-testpmd to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:08.125 Installing app/dpdk-test-regex to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:08.125 Installing app/dpdk-test-sad to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:08.125 Installing app/dpdk-test-security-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/rte_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/kvargs/rte_kvargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/telemetry/rte_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rtm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_alarm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitmap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_branch_prediction.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bus.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_class.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_compat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_debug.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_dev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_devargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_memconfig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_errno.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.125 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_epoll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_fbarray.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hexdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hypervisor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_interrupts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_keepalive.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_launch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_log.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_malloc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_mcslock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memory.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memzone.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_feature_defs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_features.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_per_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pflock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_random.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_reciprocal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqcount.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service_component.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_string_fns.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_tailq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_ticketlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_time.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point_register.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_uuid.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_version.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_vfio.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/linux/include/rte_os.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_c11_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_generic_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
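With rte_config.h, the EAL headers, and the rte_ring family now staged under build/include, code can be compiled against this tree. A sketch, assuming libdpdk.pc is on PKG_CONFIG_PATH as above and ring_demo.c is a hypothetical test file that calls rte_eal_init(), rte_ring_create(), rte_ring_enqueue() and rte_ring_dequeue():
pkg-config --modversion libdpdk  # expect a 22.11.x version for this checkout
cc -O3 $(pkg-config --cflags libdpdk) ring_demo.c -o ring_demo $(pkg-config --libs libdpdk)  # picks up build/include and the librte_* libraries installed above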
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_zc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rcu/rte_rcu_qsbr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_ptype.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_pool_ops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_dyn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_udp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_sctp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_icmp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_arp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ether.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_macsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_vxlan.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gre.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gtp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_mpls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_higig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ecpri.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_geneve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_l2tpv2.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ppp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/meter/rte_meter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_cman.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_dev_info.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.126 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.127 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.127 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.127 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_eth_ctrl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.127 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pci/rte_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.127 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.127 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.127 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_num.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.127 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_ipaddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.127 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_etheraddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.127 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_string.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.127 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_rdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.127 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_vt100.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.127 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_socket.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.127 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_cirbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.127 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_portlist.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.127 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.127 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.127 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_fbk_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.127 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.127 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.127 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_jhash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.127 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.127 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.127 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.127 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.127 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_sw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.127 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.127 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_x86_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.127 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/timer/rte_timer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.127 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.127 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl_osdep.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.127 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.127 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.127 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_op.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.389 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bitratestats/rte_bitrate.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.389 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/bpf_def.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.389 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.389 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.389 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cfgfile/rte_cfgfile.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.389 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_compressdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.389 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_comp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.389 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.389 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.389 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.389 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.389 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_sym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.389 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_asym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.389 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.389 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/distributor/rte_distributor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.389 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/efd/rte_efd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.389 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_crypto_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.389 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_rx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.389 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_tx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.389 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.389 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_timer_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.389 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.389 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.389 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.389 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gpudev/rte_gpudev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gro/rte_gro.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gso/rte_gso.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ip_frag/rte_ip_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/jobstats/rte_jobstats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/latencystats/rte_latencystats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/member/rte_member.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pcapng/rte_pcapng.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_empty_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_intel_uncore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_pmd_mgmt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_guest_channel.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:08.390 Installing
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/reorder/rte_reorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_approx.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_red.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_pie.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_std.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_c11.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_stubs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vdpa.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_async.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ras.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sym_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdump/rte_pdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_learner.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_selector.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_wm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_array.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_cuckoo.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm_ipv6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_stub.h 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_port_in_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_table_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_extern.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ctl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip4_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_eth_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/pci/rte_bus_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/vdev/rte_bus_vdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/net/i40e/rte_pmd_i40e.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-devbind.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:08.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-pmdinfo.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:08.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-telemetry.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:08.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-hugepages.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:08.391 
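The usertools scripts staged into build/bin above are DPDK's standard host-side helpers for device binding, hugepage setup, PMD inspection, and telemetry. A minimal usage sketch — the PCI address below is a placeholder for illustration, not a device from this run:

    BIN=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
    $BIN/dpdk-devbind.py --status                            # list devices and their currently bound drivers
    sudo $BIN/dpdk-devbind.py --bind=vfio-pci 0000:3b:00.0   # placeholder address: rebind a port for DPDK use
    $BIN/dpdk-hugepages.py --show                            # report the current hugepage configuration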
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/rte_build_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk-libs.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:08.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:08.391 Installing symlink pointing to librte_kvargs.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so.23 00:02:08.391 Installing symlink pointing to librte_kvargs.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so 00:02:08.391 Installing symlink pointing to librte_telemetry.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so.23 00:02:08.391 Installing symlink pointing to librte_telemetry.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so 00:02:08.391 Installing symlink pointing to librte_eal.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so.23 00:02:08.391 Installing symlink pointing to librte_eal.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so 00:02:08.391 Installing symlink pointing to librte_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so.23 00:02:08.391 Installing symlink pointing to librte_ring.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so 00:02:08.391 Installing symlink pointing to librte_rcu.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so.23 00:02:08.391 Installing symlink pointing to librte_rcu.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so 00:02:08.391 Installing symlink pointing to librte_mempool.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so.23 00:02:08.391 Installing symlink pointing to librte_mempool.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so 00:02:08.391 Installing symlink pointing to librte_mbuf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so.23 00:02:08.391 Installing symlink pointing to librte_mbuf.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so 00:02:08.391 Installing symlink pointing to librte_net.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so.23 00:02:08.391 Installing symlink pointing to librte_net.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so 00:02:08.391 Installing symlink pointing to librte_meter.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so.23 00:02:08.391 Installing symlink pointing to librte_meter.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so 00:02:08.391 Installing symlink pointing to librte_ethdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so.23 00:02:08.391 Installing symlink pointing to librte_ethdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so 00:02:08.391 Installing symlink pointing to librte_pci.so.23.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so.23 00:02:08.391 Installing symlink pointing to librte_pci.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so 00:02:08.391 Installing symlink pointing to librte_cmdline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so.23 00:02:08.391 Installing symlink pointing to librte_cmdline.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so 00:02:08.391 Installing symlink pointing to librte_metrics.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so.23 00:02:08.391 Installing symlink pointing to librte_metrics.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so 00:02:08.391 Installing symlink pointing to librte_hash.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so.23 00:02:08.391 Installing symlink pointing to librte_hash.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so 00:02:08.391 Installing symlink pointing to librte_timer.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so.23 00:02:08.391 Installing symlink pointing to librte_timer.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so 00:02:08.391 Installing symlink pointing to librte_acl.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so.23 00:02:08.391 Installing symlink pointing to librte_acl.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so 00:02:08.391 Installing symlink pointing to librte_bbdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so.23 00:02:08.391 Installing symlink pointing to librte_bbdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so 00:02:08.391 Installing symlink pointing to librte_bitratestats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so.23 00:02:08.391 Installing symlink pointing to librte_bitratestats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so 00:02:08.391 Installing symlink pointing to librte_bpf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so.23 00:02:08.391 Installing symlink pointing to librte_bpf.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so 00:02:08.391 Installing symlink pointing to librte_cfgfile.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so.23 00:02:08.391 Installing symlink pointing to librte_cfgfile.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so 00:02:08.391 Installing symlink pointing to librte_compressdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so.23 00:02:08.391 Installing symlink pointing to librte_compressdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so 00:02:08.391 Installing symlink pointing to librte_cryptodev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so.23 00:02:08.391 Installing symlink pointing to librte_cryptodev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so 00:02:08.391 Installing symlink pointing 
to librte_distributor.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so.23 00:02:08.391 Installing symlink pointing to librte_distributor.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so 00:02:08.391 Installing symlink pointing to librte_efd.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so.23 00:02:08.391 Installing symlink pointing to librte_efd.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so 00:02:08.391 Installing symlink pointing to librte_eventdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so.23 00:02:08.391 Installing symlink pointing to librte_eventdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so 00:02:08.391 Installing symlink pointing to librte_gpudev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so.23 00:02:08.391 Installing symlink pointing to librte_gpudev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so 00:02:08.391 Installing symlink pointing to librte_gro.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so.23 00:02:08.391 Installing symlink pointing to librte_gro.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so 00:02:08.391 Installing symlink pointing to librte_gso.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so.23 00:02:08.391 Installing symlink pointing to librte_gso.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so 00:02:08.391 Installing symlink pointing to librte_ip_frag.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so.23 00:02:08.391 Installing symlink pointing to librte_ip_frag.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so 00:02:08.391 Installing symlink pointing to librte_jobstats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so.23 00:02:08.391 Installing symlink pointing to librte_jobstats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so 00:02:08.391 Installing symlink pointing to librte_latencystats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so.23 00:02:08.391 Installing symlink pointing to librte_latencystats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so 00:02:08.391 Installing symlink pointing to librte_lpm.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so.23 00:02:08.391 Installing symlink pointing to librte_lpm.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so 00:02:08.391 Installing symlink pointing to librte_member.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so.23 00:02:08.391 Installing symlink pointing to librte_member.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so 00:02:08.391 Installing symlink pointing to librte_pcapng.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so.23 00:02:08.391 Installing symlink pointing to librte_pcapng.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so 00:02:08.391 
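Each librte_* library above is installed with a two-link SONAME chain, so the runtime loader and a link-time -lrte_* lookup both resolve to the same versioned file. A sketch of the equivalent commands for one library from the listing:

    cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
    ln -sf librte_lpm.so.23.0 librte_lpm.so.23   # SONAME link: what the dynamic loader opens at run time
    ln -sf librte_lpm.so.23 librte_lpm.so        # dev link: what 'cc ... -lrte_lpm' resolves at link time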
Installing symlink pointing to librte_power.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so.23 00:02:08.391 Installing symlink pointing to librte_power.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so 00:02:08.392 Installing symlink pointing to librte_rawdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so.23 00:02:08.392 Installing symlink pointing to librte_rawdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so 00:02:08.392 Installing symlink pointing to librte_regexdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so.23 00:02:08.392 Installing symlink pointing to librte_regexdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so 00:02:08.392 Installing symlink pointing to librte_dmadev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so.23 00:02:08.392 Installing symlink pointing to librte_dmadev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so 00:02:08.392 Installing symlink pointing to librte_rib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so.23 00:02:08.392 Installing symlink pointing to librte_rib.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so 00:02:08.392 Installing symlink pointing to librte_reorder.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so.23 00:02:08.392 Installing symlink pointing to librte_reorder.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so 00:02:08.392 Installing symlink pointing to librte_sched.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so.23 00:02:08.392 Installing symlink pointing to librte_sched.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so 00:02:08.392 Installing symlink pointing to librte_security.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so.23 00:02:08.392 Installing symlink pointing to librte_security.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so 00:02:08.392 Installing symlink pointing to librte_stack.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so.23 00:02:08.392 './librte_bus_pci.so' -> 'dpdk/pmds-23.0/librte_bus_pci.so' 00:02:08.392 './librte_bus_pci.so.23' -> 'dpdk/pmds-23.0/librte_bus_pci.so.23' 00:02:08.392 './librte_bus_pci.so.23.0' -> 'dpdk/pmds-23.0/librte_bus_pci.so.23.0' 00:02:08.392 './librte_bus_vdev.so' -> 'dpdk/pmds-23.0/librte_bus_vdev.so' 00:02:08.392 './librte_bus_vdev.so.23' -> 'dpdk/pmds-23.0/librte_bus_vdev.so.23' 00:02:08.392 './librte_bus_vdev.so.23.0' -> 'dpdk/pmds-23.0/librte_bus_vdev.so.23.0' 00:02:08.392 './librte_mempool_ring.so' -> 'dpdk/pmds-23.0/librte_mempool_ring.so' 00:02:08.392 './librte_mempool_ring.so.23' -> 'dpdk/pmds-23.0/librte_mempool_ring.so.23' 00:02:08.392 './librte_mempool_ring.so.23.0' -> 'dpdk/pmds-23.0/librte_mempool_ring.so.23.0' 00:02:08.392 './librte_net_i40e.so' -> 'dpdk/pmds-23.0/librte_net_i40e.so' 00:02:08.392 './librte_net_i40e.so.23' -> 'dpdk/pmds-23.0/librte_net_i40e.so.23' 00:02:08.392 './librte_net_i40e.so.23.0' -> 'dpdk/pmds-23.0/librte_net_i40e.so.23.0' 00:02:08.392 Installing symlink pointing to 
librte_stack.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so 00:02:08.392 Installing symlink pointing to librte_vhost.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so.23 00:02:08.392 Installing symlink pointing to librte_vhost.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so 00:02:08.392 Installing symlink pointing to librte_ipsec.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so.23 00:02:08.392 Installing symlink pointing to librte_ipsec.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so 00:02:08.392 Installing symlink pointing to librte_fib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so.23 00:02:08.392 Installing symlink pointing to librte_fib.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so 00:02:08.392 Installing symlink pointing to librte_port.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so.23 00:02:08.392 Installing symlink pointing to librte_port.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so 00:02:08.392 Installing symlink pointing to librte_pdump.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so.23 00:02:08.392 Installing symlink pointing to librte_pdump.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so 00:02:08.392 Installing symlink pointing to librte_table.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so.23 00:02:08.392 Installing symlink pointing to librte_table.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so 00:02:08.392 Installing symlink pointing to librte_pipeline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so.23 00:02:08.392 Installing symlink pointing to librte_pipeline.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so 00:02:08.392 Installing symlink pointing to librte_graph.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so.23 00:02:08.392 Installing symlink pointing to librte_graph.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so 00:02:08.392 Installing symlink pointing to librte_node.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so.23 00:02:08.392 Installing symlink pointing to librte_node.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so 00:02:08.392 Installing symlink pointing to librte_bus_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_pci.so.23 00:02:08.392 Installing symlink pointing to librte_bus_pci.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_pci.so 00:02:08.392 Installing symlink pointing to librte_bus_vdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_vdev.so.23 00:02:08.392 Installing symlink pointing to librte_bus_vdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_vdev.so 00:02:08.392 Installing symlink pointing to librte_mempool_ring.so.23.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_mempool_ring.so.23 00:02:08.392 Installing symlink pointing to librte_mempool_ring.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_mempool_ring.so 00:02:08.392 Installing symlink pointing to librte_net_i40e.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_net_i40e.so.23 00:02:08.392 Installing symlink pointing to librte_net_i40e.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_net_i40e.so 00:02:08.392 Running custom install script '/bin/sh /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/../buildtools/symlink-drivers-solibs.sh lib dpdk/pmds-23.0' 00:02:08.392 16:10:36 -- common/autobuild_common.sh@189 -- $ uname -s 00:02:08.392 16:10:36 -- common/autobuild_common.sh@189 -- $ [[ Linux == \F\r\e\e\B\S\D ]] 00:02:08.392 16:10:36 -- common/autobuild_common.sh@200 -- $ cat 00:02:08.392 16:10:37 -- common/autobuild_common.sh@205 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:08.392 00:02:08.392 real 0m25.364s 00:02:08.392 user 6m32.091s 00:02:08.392 sys 2m13.109s 00:02:08.392 16:10:37 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:02:08.392 16:10:37 -- common/autotest_common.sh@10 -- $ set +x 00:02:08.392 ************************************ 00:02:08.392 END TEST build_native_dpdk 00:02:08.392 ************************************ 00:02:08.392 16:10:37 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:02:08.392 16:10:37 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:02:08.392 16:10:37 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]] 00:02:08.392 16:10:37 -- spdk/autobuild.sh@52 -- $ llvm_precompile 00:02:08.392 16:10:37 -- common/autobuild_common.sh@423 -- $ run_test autobuild_llvm_precompile _llvm_precompile 00:02:08.392 16:10:37 -- common/autotest_common.sh@1077 -- $ '[' 2 -le 1 ']' 00:02:08.392 16:10:37 -- common/autotest_common.sh@1083 -- $ xtrace_disable 00:02:08.392 16:10:37 -- common/autotest_common.sh@10 -- $ set +x 00:02:08.392 ************************************ 00:02:08.392 START TEST autobuild_llvm_precompile 00:02:08.392 ************************************ 00:02:08.392 16:10:37 -- common/autotest_common.sh@1104 -- $ _llvm_precompile 00:02:08.392 16:10:37 -- common/autobuild_common.sh@32 -- $ clang --version 00:02:08.392 16:10:37 -- common/autobuild_common.sh@32 -- $ [[ clang version 16.0.6 (Fedora 16.0.6-3.fc38) 00:02:08.392 Target: x86_64-redhat-linux-gnu 00:02:08.392 Thread model: posix 00:02:08.392 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]] 00:02:08.392 16:10:37 -- common/autobuild_common.sh@33 -- $ clang_num=16 00:02:08.392 16:10:37 -- common/autobuild_common.sh@35 -- $ export CC=clang-16 00:02:08.392 16:10:37 -- common/autobuild_common.sh@35 -- $ CC=clang-16 00:02:08.392 16:10:37 -- common/autobuild_common.sh@36 -- $ export CXX=clang++-16 00:02:08.392 16:10:37 -- common/autobuild_common.sh@36 -- $ CXX=clang++-16 00:02:08.392 16:10:37 -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a) 00:02:08.392 16:10:37 -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a 00:02:08.392 16:10:37 -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a ]] 00:02:08.392 16:10:37 -- 
common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a' 00:02:08.392 16:10:37 -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a 00:02:08.652 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 00:02:08.911 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:08.911 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:08.911 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:02:09.478 Using 'verbs' RDMA provider 00:02:24.976 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done. 00:02:39.865 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done. 00:02:39.865 Creating mk/config.mk...done. 00:02:39.865 Creating mk/cc.flags.mk...done. 00:02:39.865 Type 'make' to build. 00:02:39.865 00:02:39.865 real 0m29.635s 00:02:39.865 user 0m12.618s 00:02:39.865 sys 0m16.337s 00:02:39.865 16:11:06 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:02:39.865 16:11:06 -- common/autotest_common.sh@10 -- $ set +x 00:02:39.865 ************************************ 00:02:39.865 END TEST autobuild_llvm_precompile 00:02:39.865 ************************************ 00:02:39.865 16:11:06 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:02:39.865 16:11:06 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:02:39.865 16:11:06 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:02:39.865 16:11:06 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]] 00:02:39.865 16:11:06 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a 00:02:39.865 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 00:02:39.865 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:39.865 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:39.865 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:02:39.865 Using 'verbs' RDMA provider 00:02:52.077 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done. 00:03:04.283 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done. 00:03:04.283 Creating mk/config.mk...done. 
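Both configure runs above resolve the freshly staged DPDK through the pkg-config metadata installed earlier (libdpdk.pc and libdpdk-libs.pc), which is where the "Using ... for additional libs" and "DPDK libraries/includes" lines come from. A minimal sketch of the same lookup, assuming only the paths shown in this log:

    export PKG_CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig
    pkg-config --modversion libdpdk        # prints the staged DPDK version
    pkg-config --cflags --libs libdpdk     # the include and link flags configure consumes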
00:03:04.283 Creating mk/cc.flags.mk...done. 00:03:04.283 Type 'make' to build. 00:03:04.283 16:11:31 -- spdk/autobuild.sh@69 -- $ run_test make make -j112 00:03:04.283 16:11:31 -- common/autotest_common.sh@1077 -- $ '[' 3 -le 1 ']' 00:03:04.283 16:11:31 -- common/autotest_common.sh@1083 -- $ xtrace_disable 00:03:04.283 16:11:31 -- common/autotest_common.sh@10 -- $ set +x 00:03:04.283 ************************************ 00:03:04.283 START TEST make 00:03:04.283 ************************************ 00:03:04.283 16:11:31 -- common/autotest_common.sh@1104 -- $ make -j112 00:03:04.283 make[1]: Nothing to be done for 'all'. 00:03:04.848 The Meson build system 00:03:04.848 Version: 1.3.1 00:03:04.848 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user 00:03:04.848 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:04.848 Build type: native build 00:03:04.848 Project name: libvfio-user 00:03:04.848 Project version: 0.0.1 00:03:04.848 C compiler for the host machine: clang-16 (clang 16.0.6 "clang version 16.0.6 (Fedora 16.0.6-3.fc38)") 00:03:04.848 C linker for the host machine: clang-16 ld.bfd 2.39-16 00:03:04.848 Host machine cpu family: x86_64 00:03:04.848 Host machine cpu: x86_64 00:03:04.848 Run-time dependency threads found: YES 00:03:04.848 Library dl found: YES 00:03:04.848 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0 00:03:04.848 Run-time dependency json-c found: YES 0.17 00:03:04.848 Run-time dependency cmocka found: YES 1.1.7 00:03:04.848 Program pytest-3 found: NO 00:03:04.848 Program flake8 found: NO 00:03:04.848 Program misspell-fixer found: NO 00:03:04.848 Program restructuredtext-lint found: NO 00:03:04.848 Program valgrind found: YES (/usr/bin/valgrind) 00:03:04.848 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:03:04.848 Compiler for C supports arguments -Wmissing-declarations: YES 00:03:04.848 Compiler for C supports arguments -Wwrite-strings: YES 00:03:04.848 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 00:03:04.848 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh) 00:03:04.848 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh) 00:03:04.848 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 
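The meson configuration above (clang-16 as host compiler, json-c and cmocka found as run-time dependencies) corresponds to a setup invocation along these lines — a sketch with paths abbreviated and the option values taken from the "User defined options" summary that follows:

    meson setup build-debug ../libvfio-user --buildtype=debug --default-library=static
    ninja -C build-debug   # compiles and links the 36 targets listed below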
00:03:04.848 Build targets in project: 8 00:03:04.848 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions: 00:03:04.848 * 0.57.0: {'exclude_suites arg in add_test_setup'} 00:03:04.848 00:03:04.848 libvfio-user 0.0.1 00:03:04.848 00:03:04.848 User defined options 00:03:04.848 buildtype : debug 00:03:04.848 default_library: static 00:03:04.848 libdir : /usr/local/lib 00:03:04.848 00:03:04.848 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:03:05.105 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:03:05.106 [1/36] Compiling C object lib/libvfio-user.a.p/irq.c.o 00:03:05.106 [2/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o 00:03:05.106 [3/36] Compiling C object samples/client.p/.._lib_migration.c.o 00:03:05.106 [4/36] Compiling C object samples/lspci.p/lspci.c.o 00:03:05.106 [5/36] Compiling C object samples/null.p/null.c.o 00:03:05.106 [6/36] Compiling C object lib/libvfio-user.a.p/tran.c.o 00:03:05.106 [7/36] Compiling C object lib/libvfio-user.a.p/pci.c.o 00:03:05.106 [8/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o 00:03:05.106 [9/36] Compiling C object lib/libvfio-user.a.p/migration.c.o 00:03:05.106 [10/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o 00:03:05.106 [11/36] Compiling C object samples/client.p/.._lib_tran.c.o 00:03:05.106 [12/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o 00:03:05.106 [13/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o 00:03:05.106 [14/36] Compiling C object test/unit_tests.p/mocks.c.o 00:03:05.106 [15/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o 00:03:05.106 [16/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o 00:03:05.106 [17/36] Compiling C object lib/libvfio-user.a.p/dma.c.o 00:03:05.106 [18/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o 00:03:05.106 [19/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o 00:03:05.106 [20/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o 00:03:05.106 [21/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o 00:03:05.106 [22/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o 00:03:05.106 [23/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o 00:03:05.106 [24/36] Compiling C object samples/server.p/server.c.o 00:03:05.106 [25/36] Compiling C object test/unit_tests.p/unit-tests.c.o 00:03:05.106 [26/36] Compiling C object samples/client.p/client.c.o 00:03:05.106 [27/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o 00:03:05.106 [28/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o 00:03:05.364 [29/36] Linking static target lib/libvfio-user.a 00:03:05.364 [30/36] Linking target samples/client 00:03:05.364 [31/36] Linking target samples/server 00:03:05.364 [32/36] Linking target test/unit_tests 00:03:05.364 [33/36] Linking target samples/null 00:03:05.364 [34/36] Linking target samples/gpio-pci-idio-16 00:03:05.364 [35/36] Linking target samples/lspci 00:03:05.364 [36/36] Linking target samples/shadow_ioeventfd_server 00:03:05.364 INFO: autodetecting backend as ninja 00:03:05.364 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:05.364 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:05.623 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:03:05.623 ninja: no work to do. 00:03:08.913 CC lib/ut_mock/mock.o 00:03:08.913 CC lib/ut/ut.o 00:03:08.913 CC lib/log/log.o 00:03:08.913 CC lib/log/log_flags.o 00:03:08.913 CC lib/log/log_deprecated.o 00:03:08.913 LIB libspdk_ut_mock.a 00:03:08.913 LIB libspdk_ut.a 00:03:08.913 LIB libspdk_log.a 00:03:09.173 CXX lib/trace_parser/trace.o 00:03:09.173 CC lib/dma/dma.o 00:03:09.173 CC lib/ioat/ioat.o 00:03:09.173 CC lib/util/base64.o 00:03:09.173 CC lib/util/bit_array.o 00:03:09.173 CC lib/util/cpuset.o 00:03:09.173 CC lib/util/crc32.o 00:03:09.173 CC lib/util/crc16.o 00:03:09.173 CC lib/util/crc32c.o 00:03:09.173 CC lib/util/crc32_ieee.o 00:03:09.173 CC lib/util/fd.o 00:03:09.173 CC lib/util/crc64.o 00:03:09.173 CC lib/util/dif.o 00:03:09.173 CC lib/util/file.o 00:03:09.173 CC lib/util/hexlify.o 00:03:09.173 CC lib/util/iov.o 00:03:09.173 CC lib/util/math.o 00:03:09.173 CC lib/util/pipe.o 00:03:09.173 CC lib/util/strerror_tls.o 00:03:09.173 CC lib/util/string.o 00:03:09.173 CC lib/util/uuid.o 00:03:09.173 CC lib/util/fd_group.o 00:03:09.173 CC lib/util/xor.o 00:03:09.173 CC lib/util/zipf.o 00:03:09.432 LIB libspdk_dma.a 00:03:09.432 CC lib/vfio_user/host/vfio_user_pci.o 00:03:09.432 CC lib/vfio_user/host/vfio_user.o 00:03:09.432 LIB libspdk_ioat.a 00:03:09.432 LIB libspdk_vfio_user.a 00:03:09.432 LIB libspdk_util.a 00:03:09.690 LIB libspdk_trace_parser.a 00:03:09.949 CC lib/vmd/vmd.o 00:03:09.949 CC lib/idxd/idxd.o 00:03:09.949 CC lib/vmd/led.o 00:03:09.949 CC lib/idxd/idxd_user.o 00:03:09.949 CC lib/idxd/idxd_kernel.o 00:03:09.949 CC lib/env_dpdk/env.o 00:03:09.949 CC lib/env_dpdk/pci.o 00:03:09.949 CC lib/env_dpdk/memory.o 00:03:09.949 CC lib/env_dpdk/init.o 00:03:09.949 CC lib/env_dpdk/threads.o 00:03:09.949 CC lib/env_dpdk/pci_virtio.o 00:03:09.949 CC lib/env_dpdk/pci_vmd.o 00:03:09.949 CC lib/env_dpdk/pci_ioat.o 00:03:09.949 CC lib/env_dpdk/pci_idxd.o 00:03:09.949 CC lib/env_dpdk/pci_event.o 00:03:09.949 CC lib/env_dpdk/sigbus_handler.o 00:03:09.949 CC lib/env_dpdk/pci_dpdk.o 00:03:09.949 CC lib/conf/conf.o 00:03:09.949 CC lib/env_dpdk/pci_dpdk_2207.o 00:03:09.949 CC lib/env_dpdk/pci_dpdk_2211.o 00:03:09.949 CC lib/rdma/common.o 00:03:09.949 CC lib/rdma/rdma_verbs.o 00:03:09.949 CC lib/json/json_parse.o 00:03:09.949 CC lib/json/json_util.o 00:03:09.949 CC lib/json/json_write.o 00:03:09.949 LIB libspdk_conf.a 00:03:09.949 LIB libspdk_rdma.a 00:03:10.209 LIB libspdk_json.a 00:03:10.209 LIB libspdk_idxd.a 00:03:10.209 LIB libspdk_vmd.a 00:03:10.468 CC lib/jsonrpc/jsonrpc_server.o 00:03:10.468 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:03:10.468 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:03:10.468 CC lib/jsonrpc/jsonrpc_client.o 00:03:10.468 LIB libspdk_jsonrpc.a 00:03:10.727 LIB libspdk_env_dpdk.a 00:03:10.986 CC lib/rpc/rpc.o 00:03:10.986 LIB libspdk_rpc.a 00:03:11.245 CC lib/sock/sock_rpc.o 00:03:11.245 CC lib/sock/sock.o 00:03:11.245 CC lib/trace/trace.o 00:03:11.245 CC lib/trace/trace_flags.o 00:03:11.245 CC lib/trace/trace_rpc.o 00:03:11.245 CC lib/notify/notify.o 00:03:11.245 CC lib/notify/notify_rpc.o 00:03:11.504 LIB libspdk_notify.a 00:03:11.504 LIB libspdk_trace.a 00:03:11.504 LIB libspdk_sock.a 00:03:11.763 CC lib/thread/thread.o 00:03:11.763 CC lib/thread/iobuf.o 00:03:12.021 CC lib/nvme/nvme_ctrlr_cmd.o 00:03:12.021 CC lib/nvme/nvme_ctrlr.o 00:03:12.021 CC 
lib/nvme/nvme_fabric.o 00:03:12.021 CC lib/nvme/nvme_ns_cmd.o 00:03:12.021 CC lib/nvme/nvme_pcie.o 00:03:12.021 CC lib/nvme/nvme_ns.o 00:03:12.021 CC lib/nvme/nvme_pcie_common.o 00:03:12.021 CC lib/nvme/nvme_qpair.o 00:03:12.021 CC lib/nvme/nvme.o 00:03:12.021 CC lib/nvme/nvme_quirks.o 00:03:12.021 CC lib/nvme/nvme_transport.o 00:03:12.021 CC lib/nvme/nvme_discovery.o 00:03:12.021 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:03:12.021 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:03:12.021 CC lib/nvme/nvme_tcp.o 00:03:12.021 CC lib/nvme/nvme_opal.o 00:03:12.021 CC lib/nvme/nvme_io_msg.o 00:03:12.021 CC lib/nvme/nvme_poll_group.o 00:03:12.021 CC lib/nvme/nvme_zns.o 00:03:12.021 CC lib/nvme/nvme_cuse.o 00:03:12.021 CC lib/nvme/nvme_vfio_user.o 00:03:12.021 CC lib/nvme/nvme_rdma.o 00:03:12.588 LIB libspdk_thread.a 00:03:12.846 CC lib/virtio/virtio.o 00:03:12.846 CC lib/virtio/virtio_vhost_user.o 00:03:12.846 CC lib/virtio/virtio_vfio_user.o 00:03:12.846 CC lib/virtio/virtio_pci.o 00:03:12.846 CC lib/blob/blobstore.o 00:03:12.846 CC lib/blob/request.o 00:03:12.846 CC lib/blob/blob_bs_dev.o 00:03:12.846 CC lib/blob/zeroes.o 00:03:12.846 CC lib/accel/accel.o 00:03:12.846 CC lib/accel/accel_rpc.o 00:03:12.846 CC lib/accel/accel_sw.o 00:03:12.846 CC lib/init/json_config.o 00:03:12.846 CC lib/init/subsystem.o 00:03:12.846 CC lib/init/subsystem_rpc.o 00:03:12.846 CC lib/vfu_tgt/tgt_endpoint.o 00:03:12.846 CC lib/init/rpc.o 00:03:12.846 CC lib/vfu_tgt/tgt_rpc.o 00:03:13.126 LIB libspdk_init.a 00:03:13.126 LIB libspdk_virtio.a 00:03:13.126 LIB libspdk_nvme.a 00:03:13.126 LIB libspdk_vfu_tgt.a 00:03:13.385 CC lib/event/app.o 00:03:13.385 CC lib/event/reactor.o 00:03:13.385 CC lib/event/log_rpc.o 00:03:13.385 CC lib/event/app_rpc.o 00:03:13.385 CC lib/event/scheduler_static.o 00:03:13.643 LIB libspdk_accel.a 00:03:13.643 LIB libspdk_event.a 00:03:13.902 CC lib/bdev/bdev.o 00:03:13.902 CC lib/bdev/part.o 00:03:13.902 CC lib/bdev/bdev_rpc.o 00:03:13.902 CC lib/bdev/bdev_zone.o 00:03:13.902 CC lib/bdev/scsi_nvme.o 00:03:14.518 LIB libspdk_blob.a 00:03:14.775 CC lib/blobfs/blobfs.o 00:03:14.775 CC lib/blobfs/tree.o 00:03:14.775 CC lib/lvol/lvol.o 00:03:15.342 LIB libspdk_lvol.a 00:03:15.342 LIB libspdk_blobfs.a 00:03:15.599 LIB libspdk_bdev.a 00:03:15.857 CC lib/nbd/nbd.o 00:03:15.857 CC lib/nbd/nbd_rpc.o 00:03:15.857 CC lib/ublk/ublk.o 00:03:15.857 CC lib/ublk/ublk_rpc.o 00:03:15.857 CC lib/scsi/dev.o 00:03:15.857 CC lib/scsi/lun.o 00:03:15.857 CC lib/scsi/port.o 00:03:15.857 CC lib/scsi/scsi.o 00:03:15.857 CC lib/scsi/scsi_pr.o 00:03:15.857 CC lib/ftl/ftl_core.o 00:03:15.857 CC lib/scsi/scsi_bdev.o 00:03:15.857 CC lib/ftl/ftl_init.o 00:03:15.857 CC lib/scsi/scsi_rpc.o 00:03:15.857 CC lib/scsi/task.o 00:03:15.857 CC lib/ftl/ftl_layout.o 00:03:15.857 CC lib/ftl/ftl_debug.o 00:03:15.857 CC lib/nvmf/ctrlr.o 00:03:15.857 CC lib/ftl/ftl_io.o 00:03:15.857 CC lib/nvmf/ctrlr_discovery.o 00:03:15.857 CC lib/ftl/ftl_sb.o 00:03:15.857 CC lib/nvmf/ctrlr_bdev.o 00:03:15.857 CC lib/ftl/ftl_nv_cache.o 00:03:15.857 CC lib/ftl/ftl_l2p.o 00:03:15.857 CC lib/nvmf/subsystem.o 00:03:15.857 CC lib/ftl/ftl_l2p_flat.o 00:03:15.857 CC lib/nvmf/nvmf.o 00:03:15.857 CC lib/nvmf/nvmf_rpc.o 00:03:15.857 CC lib/ftl/ftl_band.o 00:03:15.857 CC lib/nvmf/transport.o 00:03:15.857 CC lib/ftl/ftl_band_ops.o 00:03:15.857 CC lib/nvmf/tcp.o 00:03:15.857 CC lib/ftl/ftl_writer.o 00:03:15.857 CC lib/nvmf/vfio_user.o 00:03:15.857 CC lib/ftl/ftl_rq.o 00:03:15.857 CC lib/nvmf/rdma.o 00:03:15.857 CC lib/ftl/ftl_reloc.o 00:03:15.857 CC lib/ftl/ftl_l2p_cache.o 00:03:15.857 
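SPDK's quiet make output here compresses each compile and archive step into a single "CC <object>" or "LIB <archive>" line; the real flags come from the mk/config.mk and mk/cc.flags.mk files generated earlier. A simplified sketch of what one such pair denotes (flags elided, names taken from the listing; SPDK's actual recipes may differ):

    clang-16 -c lib/ftl/ftl_core.c -o ftl_core.o   # one 'CC lib/ftl/ftl_core.o' line (real flags from mk/cc.flags.mk)
    ar crs libspdk_ftl.a ftl_core.o                # the matching 'LIB libspdk_ftl.a' line (remaining objects elided)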
CC lib/ftl/ftl_p2l.o 00:03:15.857 CC lib/ftl/mngt/ftl_mngt.o 00:03:15.857 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:03:15.857 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:03:15.857 CC lib/ftl/mngt/ftl_mngt_startup.o 00:03:15.857 CC lib/ftl/mngt/ftl_mngt_md.o 00:03:15.857 CC lib/ftl/mngt/ftl_mngt_misc.o 00:03:15.857 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:03:15.857 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:03:15.857 CC lib/ftl/mngt/ftl_mngt_band.o 00:03:15.857 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:03:15.857 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:03:15.857 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:03:15.857 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:03:15.857 CC lib/ftl/utils/ftl_conf.o 00:03:15.857 CC lib/ftl/utils/ftl_md.o 00:03:15.857 CC lib/ftl/utils/ftl_mempool.o 00:03:15.857 CC lib/ftl/utils/ftl_bitmap.o 00:03:15.857 CC lib/ftl/utils/ftl_property.o 00:03:15.857 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:03:15.857 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:03:15.857 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:03:15.857 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:03:15.857 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:03:15.857 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:03:15.857 CC lib/ftl/upgrade/ftl_sb_v3.o 00:03:15.857 CC lib/ftl/upgrade/ftl_sb_v5.o 00:03:15.857 CC lib/ftl/nvc/ftl_nvc_dev.o 00:03:15.857 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:03:15.857 CC lib/ftl/base/ftl_base_dev.o 00:03:15.857 CC lib/ftl/base/ftl_base_bdev.o 00:03:15.857 CC lib/ftl/ftl_trace.o 00:03:16.115 LIB libspdk_nbd.a 00:03:16.115 LIB libspdk_scsi.a 00:03:16.115 LIB libspdk_ublk.a 00:03:16.375 LIB libspdk_ftl.a 00:03:16.634 CC lib/iscsi/conn.o 00:03:16.634 CC lib/iscsi/init_grp.o 00:03:16.634 CC lib/iscsi/iscsi.o 00:03:16.634 CC lib/iscsi/portal_grp.o 00:03:16.634 CC lib/iscsi/md5.o 00:03:16.634 CC lib/iscsi/param.o 00:03:16.634 CC lib/iscsi/tgt_node.o 00:03:16.634 CC lib/iscsi/iscsi_subsystem.o 00:03:16.634 CC lib/iscsi/iscsi_rpc.o 00:03:16.634 CC lib/iscsi/task.o 00:03:16.634 CC lib/vhost/vhost.o 00:03:16.634 CC lib/vhost/vhost_rpc.o 00:03:16.634 CC lib/vhost/rte_vhost_user.o 00:03:16.634 CC lib/vhost/vhost_scsi.o 00:03:16.634 CC lib/vhost/vhost_blk.o 00:03:16.893 LIB libspdk_nvmf.a 00:03:17.151 LIB libspdk_vhost.a 00:03:17.411 LIB libspdk_iscsi.a 00:03:17.669 CC module/env_dpdk/env_dpdk_rpc.o 00:03:17.669 CC module/vfu_device/vfu_virtio.o 00:03:17.669 CC module/vfu_device/vfu_virtio_blk.o 00:03:17.669 CC module/vfu_device/vfu_virtio_rpc.o 00:03:17.669 CC module/vfu_device/vfu_virtio_scsi.o 00:03:17.928 CC module/accel/error/accel_error.o 00:03:17.928 CC module/accel/error/accel_error_rpc.o 00:03:17.928 CC module/accel/ioat/accel_ioat.o 00:03:17.928 CC module/accel/ioat/accel_ioat_rpc.o 00:03:17.928 CC module/blob/bdev/blob_bdev.o 00:03:17.928 LIB libspdk_env_dpdk_rpc.a 00:03:17.928 CC module/sock/posix/posix.o 00:03:17.928 CC module/accel/dsa/accel_dsa.o 00:03:17.928 CC module/accel/dsa/accel_dsa_rpc.o 00:03:17.928 CC module/accel/iaa/accel_iaa.o 00:03:17.928 CC module/accel/iaa/accel_iaa_rpc.o 00:03:17.928 CC module/scheduler/dynamic/scheduler_dynamic.o 00:03:17.928 CC module/scheduler/gscheduler/gscheduler.o 00:03:17.928 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:03:17.928 LIB libspdk_accel_error.a 00:03:17.928 LIB libspdk_accel_ioat.a 00:03:17.928 LIB libspdk_scheduler_dpdk_governor.a 00:03:17.928 LIB libspdk_scheduler_gscheduler.a 00:03:17.928 LIB libspdk_scheduler_dynamic.a 00:03:17.928 LIB libspdk_blob_bdev.a 00:03:17.928 LIB libspdk_accel_dsa.a 00:03:18.199 LIB libspdk_accel_iaa.a 00:03:18.199 LIB libspdk_vfu_device.a 00:03:18.199 LIB 
libspdk_sock_posix.a 00:03:18.457 CC module/bdev/passthru/vbdev_passthru.o 00:03:18.457 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:03:18.457 CC module/bdev/null/bdev_null.o 00:03:18.457 CC module/bdev/null/bdev_null_rpc.o 00:03:18.457 CC module/bdev/split/vbdev_split_rpc.o 00:03:18.457 CC module/bdev/split/vbdev_split.o 00:03:18.457 CC module/bdev/raid/bdev_raid.o 00:03:18.457 CC module/bdev/raid/raid0.o 00:03:18.457 CC module/bdev/raid/bdev_raid_rpc.o 00:03:18.457 CC module/bdev/malloc/bdev_malloc.o 00:03:18.457 CC module/bdev/raid/bdev_raid_sb.o 00:03:18.457 CC module/bdev/raid/raid1.o 00:03:18.457 CC module/bdev/malloc/bdev_malloc_rpc.o 00:03:18.457 CC module/bdev/aio/bdev_aio_rpc.o 00:03:18.457 CC module/bdev/raid/concat.o 00:03:18.457 CC module/bdev/lvol/vbdev_lvol.o 00:03:18.457 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:03:18.457 CC module/bdev/aio/bdev_aio.o 00:03:18.457 CC module/bdev/nvme/bdev_nvme.o 00:03:18.457 CC module/bdev/nvme/bdev_nvme_rpc.o 00:03:18.457 CC module/bdev/nvme/nvme_rpc.o 00:03:18.457 CC module/bdev/gpt/gpt.o 00:03:18.457 CC module/bdev/iscsi/bdev_iscsi.o 00:03:18.457 CC module/bdev/nvme/bdev_mdns_client.o 00:03:18.457 CC module/bdev/gpt/vbdev_gpt.o 00:03:18.457 CC module/bdev/nvme/vbdev_opal.o 00:03:18.457 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:03:18.457 CC module/bdev/nvme/vbdev_opal_rpc.o 00:03:18.457 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:03:18.457 CC module/bdev/zone_block/vbdev_zone_block.o 00:03:18.457 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:03:18.457 CC module/bdev/error/vbdev_error.o 00:03:18.457 CC module/bdev/error/vbdev_error_rpc.o 00:03:18.457 CC module/bdev/virtio/bdev_virtio_scsi.o 00:03:18.457 CC module/blobfs/bdev/blobfs_bdev.o 00:03:18.457 CC module/bdev/virtio/bdev_virtio_blk.o 00:03:18.457 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:03:18.457 CC module/bdev/virtio/bdev_virtio_rpc.o 00:03:18.457 CC module/bdev/ftl/bdev_ftl.o 00:03:18.457 CC module/bdev/ftl/bdev_ftl_rpc.o 00:03:18.457 CC module/bdev/delay/vbdev_delay.o 00:03:18.457 CC module/bdev/delay/vbdev_delay_rpc.o 00:03:18.715 LIB libspdk_blobfs_bdev.a 00:03:18.715 LIB libspdk_bdev_split.a 00:03:18.715 LIB libspdk_bdev_null.a 00:03:18.715 LIB libspdk_bdev_gpt.a 00:03:18.715 LIB libspdk_bdev_passthru.a 00:03:18.715 LIB libspdk_bdev_error.a 00:03:18.715 LIB libspdk_bdev_aio.a 00:03:18.715 LIB libspdk_bdev_ftl.a 00:03:18.715 LIB libspdk_bdev_iscsi.a 00:03:18.715 LIB libspdk_bdev_zone_block.a 00:03:18.715 LIB libspdk_bdev_malloc.a 00:03:18.715 LIB libspdk_bdev_delay.a 00:03:18.715 LIB libspdk_bdev_lvol.a 00:03:18.974 LIB libspdk_bdev_virtio.a 00:03:18.974 LIB libspdk_bdev_raid.a 00:03:19.911 LIB libspdk_bdev_nvme.a 00:03:20.171 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:03:20.430 CC module/event/subsystems/sock/sock.o 00:03:20.430 CC module/event/subsystems/scheduler/scheduler.o 00:03:20.430 CC module/event/subsystems/vmd/vmd.o 00:03:20.430 CC module/event/subsystems/vmd/vmd_rpc.o 00:03:20.430 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:03:20.430 CC module/event/subsystems/iobuf/iobuf.o 00:03:20.430 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:03:20.430 LIB libspdk_event_vfu_tgt.a 00:03:20.430 LIB libspdk_event_sock.a 00:03:20.430 LIB libspdk_event_scheduler.a 00:03:20.430 LIB libspdk_event_vmd.a 00:03:20.430 LIB libspdk_event_vhost_blk.a 00:03:20.430 LIB libspdk_event_iobuf.a 00:03:20.688 CC module/event/subsystems/accel/accel.o 00:03:20.947 LIB libspdk_event_accel.a 00:03:21.205 CC module/event/subsystems/bdev/bdev.o 00:03:21.464 LIB 
libspdk_event_bdev.a 00:03:21.723 CC module/event/subsystems/ublk/ublk.o 00:03:21.723 CC module/event/subsystems/nbd/nbd.o 00:03:21.723 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:03:21.723 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:03:21.723 CC module/event/subsystems/scsi/scsi.o 00:03:21.723 LIB libspdk_event_ublk.a 00:03:21.723 LIB libspdk_event_nbd.a 00:03:21.723 LIB libspdk_event_scsi.a 00:03:21.723 LIB libspdk_event_nvmf.a 00:03:21.982 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:03:21.982 CC module/event/subsystems/iscsi/iscsi.o 00:03:22.251 LIB libspdk_event_vhost_scsi.a 00:03:22.251 LIB libspdk_event_iscsi.a 00:03:22.513 TEST_HEADER include/spdk/accel.h 00:03:22.513 TEST_HEADER include/spdk/accel_module.h 00:03:22.513 TEST_HEADER include/spdk/assert.h 00:03:22.513 TEST_HEADER include/spdk/barrier.h 00:03:22.513 TEST_HEADER include/spdk/base64.h 00:03:22.513 TEST_HEADER include/spdk/bdev.h 00:03:22.513 TEST_HEADER include/spdk/bdev_module.h 00:03:22.513 CC app/spdk_nvme_discover/discovery_aer.o 00:03:22.513 TEST_HEADER include/spdk/bdev_zone.h 00:03:22.513 TEST_HEADER include/spdk/bit_pool.h 00:03:22.513 TEST_HEADER include/spdk/blob_bdev.h 00:03:22.513 TEST_HEADER include/spdk/bit_array.h 00:03:22.513 CC app/spdk_nvme_perf/perf.o 00:03:22.513 CC app/trace_record/trace_record.o 00:03:22.513 TEST_HEADER include/spdk/blobfs_bdev.h 00:03:22.513 TEST_HEADER include/spdk/blobfs.h 00:03:22.513 CC app/spdk_top/spdk_top.o 00:03:22.513 TEST_HEADER include/spdk/blob.h 00:03:22.513 CC test/rpc_client/rpc_client_test.o 00:03:22.513 TEST_HEADER include/spdk/conf.h 00:03:22.513 TEST_HEADER include/spdk/cpuset.h 00:03:22.513 TEST_HEADER include/spdk/config.h 00:03:22.513 TEST_HEADER include/spdk/crc16.h 00:03:22.513 TEST_HEADER include/spdk/crc32.h 00:03:22.513 TEST_HEADER include/spdk/crc64.h 00:03:22.513 TEST_HEADER include/spdk/dif.h 00:03:22.513 CC app/spdk_lspci/spdk_lspci.o 00:03:22.513 CXX app/trace/trace.o 00:03:22.513 TEST_HEADER include/spdk/dma.h 00:03:22.513 TEST_HEADER include/spdk/endian.h 00:03:22.513 TEST_HEADER include/spdk/env_dpdk.h 00:03:22.513 TEST_HEADER include/spdk/env.h 00:03:22.513 TEST_HEADER include/spdk/event.h 00:03:22.513 TEST_HEADER include/spdk/fd_group.h 00:03:22.513 TEST_HEADER include/spdk/fd.h 00:03:22.513 TEST_HEADER include/spdk/file.h 00:03:22.513 TEST_HEADER include/spdk/ftl.h 00:03:22.513 TEST_HEADER include/spdk/gpt_spec.h 00:03:22.513 TEST_HEADER include/spdk/histogram_data.h 00:03:22.513 TEST_HEADER include/spdk/hexlify.h 00:03:22.513 CC app/nvmf_tgt/nvmf_main.o 00:03:22.513 TEST_HEADER include/spdk/idxd.h 00:03:22.513 CC app/spdk_nvme_identify/identify.o 00:03:22.513 TEST_HEADER include/spdk/idxd_spec.h 00:03:22.513 CC app/spdk_dd/spdk_dd.o 00:03:22.513 TEST_HEADER include/spdk/ioat.h 00:03:22.513 TEST_HEADER include/spdk/init.h 00:03:22.513 TEST_HEADER include/spdk/ioat_spec.h 00:03:22.513 TEST_HEADER include/spdk/json.h 00:03:22.513 TEST_HEADER include/spdk/iscsi_spec.h 00:03:22.513 TEST_HEADER include/spdk/jsonrpc.h 00:03:22.513 TEST_HEADER include/spdk/likely.h 00:03:22.513 TEST_HEADER include/spdk/log.h 00:03:22.513 TEST_HEADER include/spdk/lvol.h 00:03:22.513 TEST_HEADER include/spdk/memory.h 00:03:22.513 TEST_HEADER include/spdk/mmio.h 00:03:22.513 TEST_HEADER include/spdk/nbd.h 00:03:22.513 TEST_HEADER include/spdk/nvme.h 00:03:22.513 TEST_HEADER include/spdk/notify.h 00:03:22.513 TEST_HEADER include/spdk/nvme_intel.h 00:03:22.513 TEST_HEADER include/spdk/nvme_ocssd.h 00:03:22.513 TEST_HEADER include/spdk/nvme_spec.h 00:03:22.513 
TEST_HEADER include/spdk/nvme_zns.h 00:03:22.513 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:03:22.513 TEST_HEADER include/spdk/nvmf_cmd.h 00:03:22.513 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:03:22.513 TEST_HEADER include/spdk/nvmf.h 00:03:22.513 TEST_HEADER include/spdk/nvmf_spec.h 00:03:22.513 TEST_HEADER include/spdk/nvmf_transport.h 00:03:22.513 TEST_HEADER include/spdk/opal.h 00:03:22.513 TEST_HEADER include/spdk/opal_spec.h 00:03:22.513 TEST_HEADER include/spdk/pipe.h 00:03:22.513 TEST_HEADER include/spdk/pci_ids.h 00:03:22.513 TEST_HEADER include/spdk/reduce.h 00:03:22.513 TEST_HEADER include/spdk/queue.h 00:03:22.513 TEST_HEADER include/spdk/rpc.h 00:03:22.513 TEST_HEADER include/spdk/scheduler.h 00:03:22.513 TEST_HEADER include/spdk/scsi.h 00:03:22.513 TEST_HEADER include/spdk/scsi_spec.h 00:03:22.513 TEST_HEADER include/spdk/sock.h 00:03:22.513 CC app/vhost/vhost.o 00:03:22.513 TEST_HEADER include/spdk/string.h 00:03:22.513 TEST_HEADER include/spdk/stdinc.h 00:03:22.513 TEST_HEADER include/spdk/thread.h 00:03:22.513 CC examples/interrupt_tgt/interrupt_tgt.o 00:03:22.513 TEST_HEADER include/spdk/trace.h 00:03:22.513 TEST_HEADER include/spdk/trace_parser.h 00:03:22.513 TEST_HEADER include/spdk/tree.h 00:03:22.513 TEST_HEADER include/spdk/ublk.h 00:03:22.513 TEST_HEADER include/spdk/util.h 00:03:22.513 CC app/spdk_tgt/spdk_tgt.o 00:03:22.513 TEST_HEADER include/spdk/uuid.h 00:03:22.513 TEST_HEADER include/spdk/version.h 00:03:22.513 TEST_HEADER include/spdk/vfio_user_pci.h 00:03:22.513 TEST_HEADER include/spdk/vfio_user_spec.h 00:03:22.513 TEST_HEADER include/spdk/vhost.h 00:03:22.513 TEST_HEADER include/spdk/vmd.h 00:03:22.513 TEST_HEADER include/spdk/zipf.h 00:03:22.513 TEST_HEADER include/spdk/xor.h 00:03:22.513 CC app/iscsi_tgt/iscsi_tgt.o 00:03:22.513 CXX test/cpp_headers/accel.o 00:03:22.513 CXX test/cpp_headers/accel_module.o 00:03:22.513 CXX test/cpp_headers/assert.o 00:03:22.513 CXX test/cpp_headers/barrier.o 00:03:22.513 CXX test/cpp_headers/base64.o 00:03:22.513 CXX test/cpp_headers/bdev.o 00:03:22.513 CXX test/cpp_headers/bdev_module.o 00:03:22.513 CXX test/cpp_headers/bdev_zone.o 00:03:22.772 CXX test/cpp_headers/bit_array.o 00:03:22.772 CXX test/cpp_headers/bit_pool.o 00:03:22.772 CXX test/cpp_headers/blob_bdev.o 00:03:22.772 CXX test/cpp_headers/blobfs_bdev.o 00:03:22.772 CXX test/cpp_headers/blobfs.o 00:03:22.772 CXX test/cpp_headers/blob.o 00:03:22.772 CXX test/cpp_headers/conf.o 00:03:22.772 CXX test/cpp_headers/config.o 00:03:22.772 CXX test/cpp_headers/cpuset.o 00:03:22.772 CXX test/cpp_headers/crc16.o 00:03:22.772 CXX test/cpp_headers/crc32.o 00:03:22.772 CXX test/cpp_headers/dif.o 00:03:22.772 CXX test/cpp_headers/crc64.o 00:03:22.772 CXX test/cpp_headers/dma.o 00:03:22.772 CXX test/cpp_headers/endian.o 00:03:22.772 CXX test/cpp_headers/env_dpdk.o 00:03:22.772 CXX test/cpp_headers/env.o 00:03:22.772 CXX test/cpp_headers/fd.o 00:03:22.772 CXX test/cpp_headers/fd_group.o 00:03:22.772 CXX test/cpp_headers/event.o 00:03:22.772 CXX test/cpp_headers/file.o 00:03:22.772 CXX test/cpp_headers/ftl.o 00:03:22.772 CXX test/cpp_headers/hexlify.o 00:03:22.772 CXX test/cpp_headers/gpt_spec.o 00:03:22.772 CXX test/cpp_headers/histogram_data.o 00:03:22.772 CXX test/cpp_headers/idxd.o 00:03:22.772 CXX test/cpp_headers/idxd_spec.o 00:03:22.772 CXX test/cpp_headers/init.o 00:03:22.772 CC examples/ioat/perf/perf.o 00:03:22.772 CC test/nvme/simple_copy/simple_copy.o 00:03:22.772 CC test/nvme/reset/reset.o 00:03:22.772 CC test/app/stub/stub.o 00:03:22.772 CC 
test/app/jsoncat/jsoncat.o 00:03:22.772 CC examples/nvme/nvme_manage/nvme_manage.o 00:03:22.772 CC test/event/event_perf/event_perf.o 00:03:22.773 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:03:22.773 CC examples/nvme/hotplug/hotplug.o 00:03:22.773 CC examples/nvme/reconnect/reconnect.o 00:03:22.773 CC test/app/histogram_perf/histogram_perf.o 00:03:22.773 CC examples/accel/perf/accel_perf.o 00:03:22.773 CC examples/vmd/lsvmd/lsvmd.o 00:03:22.773 CC test/nvme/sgl/sgl.o 00:03:22.773 CC examples/vmd/led/led.o 00:03:22.773 CC examples/nvme/hello_world/hello_world.o 00:03:22.773 CC app/fio/nvme/fio_plugin.o 00:03:22.773 CC examples/util/zipf/zipf.o 00:03:22.773 CC test/nvme/overhead/overhead.o 00:03:22.773 CC test/nvme/err_injection/err_injection.o 00:03:22.773 CC examples/nvme/arbitration/arbitration.o 00:03:22.773 CC examples/idxd/perf/perf.o 00:03:22.773 CC test/thread/lock/spdk_lock.o 00:03:22.773 CC examples/nvme/abort/abort.o 00:03:22.773 CC examples/ioat/verify/verify.o 00:03:22.773 CC test/accel/dif/dif.o 00:03:22.773 CC test/event/reactor_perf/reactor_perf.o 00:03:22.773 CC test/nvme/e2edp/nvme_dp.o 00:03:22.773 CC test/nvme/aer/aer.o 00:03:22.773 CC test/nvme/boot_partition/boot_partition.o 00:03:22.773 CC examples/nvme/cmb_copy/cmb_copy.o 00:03:22.773 CC test/thread/poller_perf/poller_perf.o 00:03:22.773 CC test/nvme/fused_ordering/fused_ordering.o 00:03:22.773 CC test/nvme/fdp/fdp.o 00:03:22.773 CC test/nvme/startup/startup.o 00:03:22.773 CC examples/sock/hello_world/hello_sock.o 00:03:22.773 CC test/nvme/connect_stress/connect_stress.o 00:03:22.773 CC test/env/vtophys/vtophys.o 00:03:22.773 CC test/nvme/cuse/cuse.o 00:03:22.773 CC test/nvme/reserve/reserve.o 00:03:22.773 CC test/env/memory/memory_ut.o 00:03:22.773 CC test/event/reactor/reactor.o 00:03:22.773 CC test/nvme/compliance/nvme_compliance.o 00:03:22.773 CC test/nvme/doorbell_aers/doorbell_aers.o 00:03:22.773 CXX test/cpp_headers/ioat.o 00:03:22.773 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:03:22.773 CC test/env/pci/pci_ut.o 00:03:22.773 CC test/event/app_repeat/app_repeat.o 00:03:22.773 CC test/app/bdev_svc/bdev_svc.o 00:03:22.773 CC app/fio/bdev/fio_plugin.o 00:03:22.773 CC examples/nvmf/nvmf/nvmf.o 00:03:22.773 CC examples/blob/cli/blobcli.o 00:03:22.773 CC examples/thread/thread/thread_ex.o 00:03:22.773 CC examples/blob/hello_world/hello_blob.o 00:03:22.773 CC test/blobfs/mkfs/mkfs.o 00:03:22.773 LINK spdk_lspci 00:03:22.773 CC test/dma/test_dma/test_dma.o 00:03:22.773 CC test/bdev/bdevio/bdevio.o 00:03:22.773 LINK rpc_client_test 00:03:22.773 CC test/event/scheduler/scheduler.o 00:03:22.773 CC examples/bdev/hello_world/hello_bdev.o 00:03:22.773 CC examples/bdev/bdevperf/bdevperf.o 00:03:22.773 CC test/lvol/esnap/esnap.o 00:03:22.773 LINK spdk_nvme_discover 00:03:22.773 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:03:22.773 CC test/env/mem_callbacks/mem_callbacks.o 00:03:22.773 LINK nvmf_tgt 00:03:22.773 LINK spdk_trace_record 00:03:22.773 CXX test/cpp_headers/ioat_spec.o 00:03:22.773 CXX test/cpp_headers/iscsi_spec.o 00:03:22.773 CXX test/cpp_headers/json.o 00:03:22.773 LINK vhost 00:03:23.045 CXX test/cpp_headers/jsonrpc.o 00:03:23.045 CXX test/cpp_headers/likely.o 00:03:23.045 CXX test/cpp_headers/log.o 00:03:23.045 LINK jsoncat 00:03:23.045 CXX test/cpp_headers/lvol.o 00:03:23.045 LINK interrupt_tgt 00:03:23.045 CXX test/cpp_headers/memory.o 00:03:23.045 CXX test/cpp_headers/mmio.o 00:03:23.045 CXX test/cpp_headers/nbd.o 00:03:23.045 CXX test/cpp_headers/notify.o 00:03:23.045 LINK lsvmd 
00:03:23.045 CXX test/cpp_headers/nvme.o 00:03:23.045 CXX test/cpp_headers/nvme_intel.o 00:03:23.045 CXX test/cpp_headers/nvme_ocssd.o 00:03:23.045 CXX test/cpp_headers/nvme_ocssd_spec.o 00:03:23.045 CXX test/cpp_headers/nvme_spec.o 00:03:23.045 CXX test/cpp_headers/nvme_zns.o 00:03:23.045 CXX test/cpp_headers/nvmf_cmd.o 00:03:23.045 CXX test/cpp_headers/nvmf_fc_spec.o 00:03:23.045 LINK histogram_perf 00:03:23.045 CXX test/cpp_headers/nvmf.o 00:03:23.045 CXX test/cpp_headers/nvmf_spec.o 00:03:23.045 CXX test/cpp_headers/nvmf_transport.o 00:03:23.045 CXX test/cpp_headers/opal.o 00:03:23.045 LINK reactor_perf 00:03:23.045 CXX test/cpp_headers/opal_spec.o 00:03:23.045 LINK reactor 00:03:23.045 CXX test/cpp_headers/pci_ids.o 00:03:23.045 LINK event_perf 00:03:23.045 CXX test/cpp_headers/pipe.o 00:03:23.045 CXX test/cpp_headers/queue.o 00:03:23.045 LINK zipf 00:03:23.045 CXX test/cpp_headers/reduce.o 00:03:23.045 LINK spdk_tgt 00:03:23.045 CXX test/cpp_headers/rpc.o 00:03:23.045 CXX test/cpp_headers/scheduler.o 00:03:23.045 LINK led 00:03:23.045 LINK poller_perf 00:03:23.045 LINK iscsi_tgt 00:03:23.045 LINK vtophys 00:03:23.045 LINK env_dpdk_post_init 00:03:23.045 LINK pmr_persistence 00:03:23.045 LINK stub 00:03:23.045 LINK boot_partition 00:03:23.045 CXX test/cpp_headers/scsi.o 00:03:23.045 LINK startup 00:03:23.045 LINK app_repeat 00:03:23.045 LINK err_injection 00:03:23.045 CXX test/cpp_headers/scsi_spec.o 00:03:23.045 CXX test/cpp_headers/sock.o 00:03:23.045 LINK verify 00:03:23.045 LINK doorbell_aers 00:03:23.045 LINK fused_ordering 00:03:23.045 fio_plugin.c:1491:29: warning: field 'ruhs' with variable sized type 'struct spdk_nvme_fdp_ruhs' not at the end of a struct or class is a GNU extension [-Wgnu-variable-sized-type-not-at-end] 00:03:23.045 struct spdk_nvme_fdp_ruhs ruhs; 00:03:23.045 ^ 00:03:23.045 LINK reserve 00:03:23.045 LINK cmb_copy 00:03:23.045 LINK connect_stress 00:03:23.045 CXX test/cpp_headers/stdinc.o 00:03:23.045 LINK bdev_svc 00:03:23.045 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:03:23.045 LINK ioat_perf 00:03:23.045 LINK hello_world 00:03:23.045 LINK mkfs 00:03:23.045 LINK hello_sock 00:03:23.045 LINK simple_copy 00:03:23.045 LINK hotplug 00:03:23.045 LINK aer 00:03:23.045 LINK reset 00:03:23.045 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:03:23.045 LINK fdp 00:03:23.045 LINK overhead 00:03:23.045 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:03:23.045 LINK scheduler 00:03:23.045 LINK hello_blob 00:03:23.045 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:03:23.045 LINK nvme_dp 00:03:23.045 LINK hello_bdev 00:03:23.045 LINK sgl 00:03:23.045 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:03:23.045 LINK thread 00:03:23.045 LINK nvmf 00:03:23.045 LINK idxd_perf 00:03:23.045 LINK mem_callbacks 00:03:23.045 LINK spdk_trace 00:03:23.045 CXX test/cpp_headers/string.o 00:03:23.045 CXX test/cpp_headers/trace.o 00:03:23.045 CXX test/cpp_headers/thread.o 00:03:23.045 CXX test/cpp_headers/trace_parser.o 00:03:23.304 CXX test/cpp_headers/tree.o 00:03:23.304 CXX test/cpp_headers/ublk.o 00:03:23.304 CXX test/cpp_headers/util.o 00:03:23.304 CXX test/cpp_headers/uuid.o 00:03:23.304 CXX test/cpp_headers/version.o 00:03:23.304 CXX test/cpp_headers/vfio_user_pci.o 00:03:23.304 CXX test/cpp_headers/vfio_user_spec.o 00:03:23.304 CXX test/cpp_headers/vhost.o 00:03:23.304 CXX test/cpp_headers/vmd.o 00:03:23.304 CXX test/cpp_headers/xor.o 00:03:23.304 LINK abort 00:03:23.304 CXX test/cpp_headers/zipf.o 00:03:23.304 LINK arbitration 00:03:23.304 LINK spdk_dd 00:03:23.304 LINK dif 
00:03:23.304 LINK bdevio 00:03:23.304 LINK reconnect 00:03:23.304 LINK accel_perf 00:03:23.304 LINK test_dma 00:03:23.304 LINK nvme_compliance 00:03:23.304 LINK blobcli 00:03:23.304 LINK nvme_fuzz 00:03:23.304 LINK nvme_manage 00:03:23.561 LINK pci_ut 00:03:23.561 LINK spdk_nvme_perf 00:03:23.561 LINK llvm_vfio_fuzz 00:03:23.561 LINK memory_ut 00:03:23.561 LINK spdk_bdev 00:03:23.561 1 warning generated. 00:03:23.561 LINK spdk_nvme 00:03:23.818 LINK vhost_fuzz 00:03:23.818 LINK spdk_nvme_identify 00:03:23.818 LINK bdevperf 00:03:23.818 LINK cuse 00:03:23.818 LINK spdk_top 00:03:23.818 LINK llvm_nvme_fuzz 00:03:24.395 LINK spdk_lock 00:03:24.657 LINK iscsi_fuzz 00:03:26.556 LINK esnap 00:03:26.815 00:03:26.815 real 0m24.038s 00:03:26.815 user 4m35.936s 00:03:26.815 sys 1m52.417s 00:03:26.815 16:11:55 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:03:26.815 16:11:55 -- common/autotest_common.sh@10 -- $ set +x 00:03:26.815 ************************************ 00:03:26.815 END TEST make 00:03:26.815 ************************************ 00:03:26.815 16:11:55 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:03:26.815 16:11:55 -- nvmf/common.sh@7 -- # uname -s 00:03:26.815 16:11:55 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:03:26.815 16:11:55 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:03:26.815 16:11:55 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:03:26.815 16:11:55 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:03:26.815 16:11:55 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:03:26.815 16:11:55 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:03:26.815 16:11:55 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:03:26.815 16:11:55 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:03:26.815 16:11:55 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:03:26.816 16:11:55 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:03:26.816 16:11:55 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:03:26.816 16:11:55 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:03:26.816 16:11:55 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:03:26.816 16:11:55 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:03:26.816 16:11:55 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:03:26.816 16:11:55 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:03:26.816 16:11:55 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:03:26.816 16:11:55 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:03:27.074 16:11:55 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:03:27.074 16:11:55 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:27.074 16:11:55 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:27.074 16:11:55 -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:27.074 16:11:55 -- paths/export.sh@5 -- # export PATH 00:03:27.074 16:11:55 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:27.074 16:11:55 -- nvmf/common.sh@46 -- # : 0 00:03:27.074 16:11:55 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:03:27.074 16:11:55 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:03:27.074 16:11:55 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:03:27.074 16:11:55 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:03:27.074 16:11:55 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:03:27.074 16:11:55 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:03:27.074 16:11:55 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:03:27.074 16:11:55 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:03:27.074 16:11:55 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:03:27.074 16:11:55 -- spdk/autotest.sh@32 -- # uname -s 00:03:27.074 16:11:55 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:03:27.074 16:11:55 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:03:27.074 16:11:55 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:27.074 16:11:55 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:03:27.074 16:11:55 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:27.074 16:11:55 -- spdk/autotest.sh@44 -- # modprobe nbd 00:03:27.074 16:11:55 -- spdk/autotest.sh@46 -- # type -P udevadm 00:03:27.074 16:11:55 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:03:27.074 16:11:55 -- spdk/autotest.sh@48 -- # udevadm_pid=2199857 00:03:27.074 16:11:55 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:03:27.074 16:11:55 -- spdk/autotest.sh@51 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:27.074 16:11:55 -- spdk/autotest.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:27.074 16:11:55 -- spdk/autotest.sh@54 -- # echo 2199859 00:03:27.074 16:11:55 -- spdk/autotest.sh@56 -- # echo 2199860 00:03:27.074 16:11:55 -- spdk/autotest.sh@55 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:27.074 16:11:55 -- spdk/autotest.sh@58 -- # [[ ............................... 
!= QEMU ]] 00:03:27.074 16:11:55 -- spdk/autotest.sh@59 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:03:27.074 16:11:55 -- spdk/autotest.sh@60 -- # echo 2199861 00:03:27.074 16:11:55 -- spdk/autotest.sh@62 -- # echo 2199862 00:03:27.074 16:11:55 -- spdk/autotest.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:03:27.074 16:11:55 -- spdk/autotest.sh@66 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:03:27.074 16:11:55 -- spdk/autotest.sh@68 -- # timing_enter autotest 00:03:27.074 16:11:55 -- common/autotest_common.sh@712 -- # xtrace_disable 00:03:27.074 16:11:55 -- common/autotest_common.sh@10 -- # set +x 00:03:27.074 16:11:55 -- spdk/autotest.sh@70 -- # create_test_list 00:03:27.074 16:11:55 -- common/autotest_common.sh@736 -- # xtrace_disable 00:03:27.074 16:11:55 -- common/autotest_common.sh@10 -- # set +x 00:03:27.074 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.bmc.pm.log 00:03:27.074 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pm.log 00:03:27.074 16:11:55 -- spdk/autotest.sh@72 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:03:27.074 16:11:55 -- spdk/autotest.sh@72 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:27.074 16:11:55 -- spdk/autotest.sh@72 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:27.074 16:11:55 -- spdk/autotest.sh@73 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:03:27.074 16:11:55 -- spdk/autotest.sh@74 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:27.074 16:11:55 -- spdk/autotest.sh@76 -- # freebsd_update_contigmem_mod 00:03:27.074 16:11:55 -- common/autotest_common.sh@1440 -- # uname 00:03:27.074 16:11:55 -- common/autotest_common.sh@1440 -- # '[' Linux = FreeBSD ']' 00:03:27.074 16:11:55 -- spdk/autotest.sh@77 -- # freebsd_set_maxsock_buf 00:03:27.074 16:11:55 -- common/autotest_common.sh@1460 -- # uname 00:03:27.074 16:11:55 -- common/autotest_common.sh@1460 -- # [[ Linux = FreeBSD ]] 00:03:27.074 16:11:55 -- spdk/autotest.sh@82 -- # grep CC_TYPE mk/cc.mk 00:03:27.074 16:11:55 -- spdk/autotest.sh@82 -- # CC_TYPE=CC_TYPE=clang 00:03:27.074 16:11:55 -- spdk/autotest.sh@83 -- # hash lcov 00:03:27.074 16:11:55 -- spdk/autotest.sh@83 -- # [[ CC_TYPE=clang == *\c\l\a\n\g* ]] 00:03:27.074 16:11:55 -- spdk/autotest.sh@100 -- # timing_enter pre_cleanup 00:03:27.074 16:11:55 -- common/autotest_common.sh@712 -- # xtrace_disable 00:03:27.074 16:11:55 -- common/autotest_common.sh@10 -- # set +x 00:03:27.074 16:11:55 -- spdk/autotest.sh@102 -- # rm -f 00:03:27.074 16:11:55 -- spdk/autotest.sh@105 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:30.355 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:03:30.613 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:03:30.613 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:03:30.613 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:03:30.613 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:03:30.613 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:03:30.613 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:03:30.613 
0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:03:30.613 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:03:30.613 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:03:30.872 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:03:30.872 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:03:30.872 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:03:30.872 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:03:30.872 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:03:30.872 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:03:30.872 0000:d8:00.0 (8086 0a54): Already using the nvme driver 00:03:30.872 16:11:59 -- spdk/autotest.sh@107 -- # get_zoned_devs 00:03:30.872 16:11:59 -- common/autotest_common.sh@1654 -- # zoned_devs=() 00:03:30.872 16:11:59 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs 00:03:30.872 16:11:59 -- common/autotest_common.sh@1655 -- # local nvme bdf 00:03:30.872 16:11:59 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme* 00:03:30.872 16:11:59 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1 00:03:30.872 16:11:59 -- common/autotest_common.sh@1647 -- # local device=nvme0n1 00:03:30.872 16:11:59 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:30.872 16:11:59 -- common/autotest_common.sh@1650 -- # [[ none != none ]] 00:03:30.872 16:11:59 -- spdk/autotest.sh@109 -- # (( 0 > 0 )) 00:03:30.872 16:11:59 -- spdk/autotest.sh@121 -- # ls /dev/nvme0n1 00:03:30.872 16:11:59 -- spdk/autotest.sh@121 -- # grep -v p 00:03:30.872 16:11:59 -- spdk/autotest.sh@121 -- # for dev in $(ls /dev/nvme*n* | grep -v p || true) 00:03:30.872 16:11:59 -- spdk/autotest.sh@123 -- # [[ -z '' ]] 00:03:30.872 16:11:59 -- spdk/autotest.sh@124 -- # block_in_use /dev/nvme0n1 00:03:30.872 16:11:59 -- scripts/common.sh@380 -- # local block=/dev/nvme0n1 pt 00:03:30.872 16:11:59 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:03:30.872 No valid GPT data, bailing 00:03:31.131 16:11:59 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:31.131 16:11:59 -- scripts/common.sh@393 -- # pt= 00:03:31.131 16:11:59 -- scripts/common.sh@394 -- # return 1 00:03:31.131 16:11:59 -- spdk/autotest.sh@125 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:03:31.131 1+0 records in 00:03:31.131 1+0 records out 00:03:31.131 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00593087 s, 177 MB/s 00:03:31.131 16:11:59 -- spdk/autotest.sh@129 -- # sync 00:03:31.131 16:11:59 -- spdk/autotest.sh@131 -- # xtrace_disable_per_cmd reap_spdk_processes 00:03:31.131 16:11:59 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:03:31.131 16:11:59 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:03:39.243 16:12:06 -- spdk/autotest.sh@135 -- # uname -s 00:03:39.243 16:12:06 -- spdk/autotest.sh@135 -- # '[' Linux = Linux ']' 00:03:39.243 16:12:06 -- spdk/autotest.sh@136 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:39.243 16:12:06 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:39.243 16:12:06 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:39.243 16:12:06 -- common/autotest_common.sh@10 -- # set +x 00:03:39.243 ************************************ 00:03:39.243 START TEST setup.sh 00:03:39.243 ************************************ 00:03:39.243 16:12:06 -- 
common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:39.243 * Looking for test storage... 00:03:39.243 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:39.243 16:12:06 -- setup/test-setup.sh@10 -- # uname -s 00:03:39.243 16:12:06 -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:03:39.243 16:12:06 -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:39.243 16:12:06 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:39.243 16:12:06 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:39.243 16:12:06 -- common/autotest_common.sh@10 -- # set +x 00:03:39.243 ************************************ 00:03:39.243 START TEST acl 00:03:39.243 ************************************ 00:03:39.243 16:12:06 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:39.243 * Looking for test storage... 00:03:39.243 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:39.243 16:12:06 -- setup/acl.sh@10 -- # get_zoned_devs 00:03:39.243 16:12:06 -- common/autotest_common.sh@1654 -- # zoned_devs=() 00:03:39.243 16:12:06 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs 00:03:39.243 16:12:06 -- common/autotest_common.sh@1655 -- # local nvme bdf 00:03:39.243 16:12:06 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme* 00:03:39.244 16:12:06 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1 00:03:39.244 16:12:06 -- common/autotest_common.sh@1647 -- # local device=nvme0n1 00:03:39.244 16:12:06 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:39.244 16:12:06 -- common/autotest_common.sh@1650 -- # [[ none != none ]] 00:03:39.244 16:12:06 -- setup/acl.sh@12 -- # devs=() 00:03:39.244 16:12:06 -- setup/acl.sh@12 -- # declare -a devs 00:03:39.244 16:12:06 -- setup/acl.sh@13 -- # drivers=() 00:03:39.244 16:12:06 -- setup/acl.sh@13 -- # declare -A drivers 00:03:39.244 16:12:06 -- setup/acl.sh@51 -- # setup reset 00:03:39.244 16:12:06 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:39.244 16:12:06 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:41.775 16:12:10 -- setup/acl.sh@52 -- # collect_setup_devs 00:03:41.775 16:12:10 -- setup/acl.sh@16 -- # local dev driver 00:03:41.775 16:12:10 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.775 16:12:10 -- setup/acl.sh@15 -- # setup output status 00:03:41.775 16:12:10 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:41.775 16:12:10 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:03:45.063 Hugepages 00:03:45.063 node hugesize free / total 00:03:45.063 16:12:13 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:45.063 16:12:13 -- setup/acl.sh@19 -- # continue 00:03:45.063 16:12:13 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:45.063 16:12:13 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:45.063 16:12:13 -- setup/acl.sh@19 -- # continue 00:03:45.063 16:12:13 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:45.063 16:12:13 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:45.063 16:12:13 -- setup/acl.sh@19 -- # continue 00:03:45.063 16:12:13 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:45.063 00:03:45.063 Type BDF Vendor Device 
NUMA Driver Device Block devices 00:03:45.063 16:12:13 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:45.063 16:12:13 -- setup/acl.sh@19 -- # continue 00:03:45.063 16:12:13 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:45.063 16:12:13 -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:03:45.063 16:12:13 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:45.063 16:12:13 -- setup/acl.sh@20 -- # continue 00:03:45.063 16:12:13 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:45.063 16:12:13 -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:03:45.063 16:12:13 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:45.063 16:12:13 -- setup/acl.sh@20 -- # continue 00:03:45.063 16:12:13 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:45.063 16:12:13 -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:03:45.063 16:12:13 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:45.063 16:12:13 -- setup/acl.sh@20 -- # continue 00:03:45.063 16:12:13 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:45.063 16:12:13 -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:03:45.063 16:12:13 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:45.063 16:12:13 -- setup/acl.sh@20 -- # continue 00:03:45.063 16:12:13 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:45.063 16:12:13 -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:03:45.063 16:12:13 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:45.063 16:12:13 -- setup/acl.sh@20 -- # continue 00:03:45.063 16:12:13 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:45.063 16:12:13 -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:03:45.063 16:12:13 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:45.063 16:12:13 -- setup/acl.sh@20 -- # continue 00:03:45.063 16:12:13 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:45.063 16:12:13 -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:03:45.063 16:12:13 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:45.063 16:12:13 -- setup/acl.sh@20 -- # continue 00:03:45.063 16:12:13 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:45.063 16:12:13 -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:03:45.063 16:12:13 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:45.063 16:12:13 -- setup/acl.sh@20 -- # continue 00:03:45.063 16:12:13 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:45.063 16:12:13 -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:03:45.063 16:12:13 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:45.063 16:12:13 -- setup/acl.sh@20 -- # continue 00:03:45.063 16:12:13 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:45.063 16:12:13 -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:03:45.063 16:12:13 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:45.063 16:12:13 -- setup/acl.sh@20 -- # continue 00:03:45.063 16:12:13 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:45.063 16:12:13 -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:03:45.063 16:12:13 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:45.063 16:12:13 -- setup/acl.sh@20 -- # continue 00:03:45.063 16:12:13 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:45.063 16:12:13 -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:03:45.063 16:12:13 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:45.063 16:12:13 -- setup/acl.sh@20 -- # continue 00:03:45.063 16:12:13 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:45.063 16:12:13 -- 
setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:03:45.063 16:12:13 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:45.063 16:12:13 -- setup/acl.sh@20 -- # continue 00:03:45.063 16:12:13 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:45.063 16:12:13 -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:03:45.063 16:12:13 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:45.063 16:12:13 -- setup/acl.sh@20 -- # continue 00:03:45.063 16:12:13 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:45.063 16:12:13 -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:03:45.063 16:12:13 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:45.063 16:12:13 -- setup/acl.sh@20 -- # continue 00:03:45.063 16:12:13 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:45.063 16:12:13 -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:03:45.063 16:12:13 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:45.063 16:12:13 -- setup/acl.sh@20 -- # continue 00:03:45.063 16:12:13 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:45.063 16:12:13 -- setup/acl.sh@19 -- # [[ 0000:d8:00.0 == *:*:*.* ]] 00:03:45.063 16:12:13 -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:03:45.063 16:12:13 -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:03:45.063 16:12:13 -- setup/acl.sh@22 -- # devs+=("$dev") 00:03:45.063 16:12:13 -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:03:45.063 16:12:13 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:45.063 16:12:13 -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:03:45.063 16:12:13 -- setup/acl.sh@54 -- # run_test denied denied 00:03:45.063 16:12:13 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:45.063 16:12:13 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:45.063 16:12:13 -- common/autotest_common.sh@10 -- # set +x 00:03:45.063 ************************************ 00:03:45.063 START TEST denied 00:03:45.063 ************************************ 00:03:45.063 16:12:13 -- common/autotest_common.sh@1104 -- # denied 00:03:45.063 16:12:13 -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:d8:00.0' 00:03:45.063 16:12:13 -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:d8:00.0' 00:03:45.063 16:12:13 -- setup/acl.sh@38 -- # setup output config 00:03:45.063 16:12:13 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:45.063 16:12:13 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:48.496 0000:d8:00.0 (8086 0a54): Skipping denied controller at 0000:d8:00.0 00:03:48.496 16:12:17 -- setup/acl.sh@40 -- # verify 0000:d8:00.0 00:03:48.496 16:12:17 -- setup/acl.sh@28 -- # local dev driver 00:03:48.496 16:12:17 -- setup/acl.sh@30 -- # for dev in "$@" 00:03:48.496 16:12:17 -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:d8:00.0 ]] 00:03:48.496 16:12:17 -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver 00:03:48.496 16:12:17 -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:03:48.496 16:12:17 -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:03:48.496 16:12:17 -- setup/acl.sh@41 -- # setup reset 00:03:48.496 16:12:17 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:48.496 16:12:17 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:53.774 00:03:53.774 real 0m8.095s 00:03:53.774 user 0m2.547s 00:03:53.774 sys 0m4.902s 00:03:53.774 16:12:21 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:53.774 16:12:21 -- 
common/autotest_common.sh@10 -- # set +x 00:03:53.774 ************************************ 00:03:53.774 END TEST denied 00:03:53.774 ************************************ 00:03:53.774 16:12:21 -- setup/acl.sh@55 -- # run_test allowed allowed 00:03:53.774 16:12:21 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:53.774 16:12:21 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:53.774 16:12:21 -- common/autotest_common.sh@10 -- # set +x 00:03:53.774 ************************************ 00:03:53.774 START TEST allowed 00:03:53.774 ************************************ 00:03:53.774 16:12:21 -- common/autotest_common.sh@1104 -- # allowed 00:03:53.774 16:12:21 -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:d8:00.0 00:03:53.774 16:12:21 -- setup/acl.sh@45 -- # setup output config 00:03:53.774 16:12:21 -- setup/acl.sh@46 -- # grep -E '0000:d8:00.0 .*: nvme -> .*' 00:03:53.774 16:12:21 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:53.774 16:12:21 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:57.973 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:03:57.973 16:12:26 -- setup/acl.sh@47 -- # verify 00:03:57.973 16:12:26 -- setup/acl.sh@28 -- # local dev driver 00:03:57.973 16:12:26 -- setup/acl.sh@48 -- # setup reset 00:03:57.973 16:12:26 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:57.973 16:12:26 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:02.160 00:04:02.160 real 0m8.825s 00:04:02.160 user 0m2.508s 00:04:02.160 sys 0m4.898s 00:04:02.160 16:12:30 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:02.160 16:12:30 -- common/autotest_common.sh@10 -- # set +x 00:04:02.160 ************************************ 00:04:02.160 END TEST allowed 00:04:02.160 ************************************ 00:04:02.160 00:04:02.160 real 0m23.923s 00:04:02.160 user 0m7.592s 00:04:02.160 sys 0m14.532s 00:04:02.160 16:12:30 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:02.160 16:12:30 -- common/autotest_common.sh@10 -- # set +x 00:04:02.160 ************************************ 00:04:02.160 END TEST acl 00:04:02.160 ************************************ 00:04:02.160 16:12:30 -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:04:02.160 16:12:30 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:02.160 16:12:30 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:02.160 16:12:30 -- common/autotest_common.sh@10 -- # set +x 00:04:02.160 ************************************ 00:04:02.160 START TEST hugepages 00:04:02.160 ************************************ 00:04:02.160 16:12:30 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:04:02.160 * Looking for test storage... 
00:04:02.160 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:02.160 16:12:30 -- setup/hugepages.sh@10 -- # nodes_sys=() 00:04:02.160 16:12:30 -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:04:02.160 16:12:30 -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:04:02.160 16:12:30 -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:04:02.160 16:12:30 -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:04:02.160 16:12:30 -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:04:02.160 16:12:30 -- setup/common.sh@17 -- # local get=Hugepagesize 00:04:02.160 16:12:30 -- setup/common.sh@18 -- # local node= 00:04:02.160 16:12:30 -- setup/common.sh@19 -- # local var val 00:04:02.160 16:12:30 -- setup/common.sh@20 -- # local mem_f mem 00:04:02.160 16:12:30 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:02.160 16:12:30 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:02.160 16:12:30 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:02.160 16:12:30 -- setup/common.sh@28 -- # mapfile -t mem 00:04:02.160 16:12:30 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:02.160 16:12:30 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.160 16:12:30 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.160 16:12:30 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 40211320 kB' 'MemAvailable: 42066612 kB' 'Buffers: 4300 kB' 'Cached: 11309544 kB' 'SwapCached: 28 kB' 'Active: 10442524 kB' 'Inactive: 1486324 kB' 'Active(anon): 9959268 kB' 'Inactive(anon): 32212 kB' 'Active(file): 483256 kB' 'Inactive(file): 1454112 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 618532 kB' 'Mapped: 177312 kB' 'Shmem: 9376476 kB' 'KReclaimable: 565620 kB' 'Slab: 1212564 kB' 'SReclaimable: 565620 kB' 'SUnreclaim: 646944 kB' 'KernelStack: 21952 kB' 'PageTables: 8908 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36439068 kB' 'Committed_AS: 11391244 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216180 kB' 'VmallocChunk: 0 kB' 'Percpu: 104832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 3177844 kB' 'DirectMap2M: 52082688 kB' 'DirectMap1G: 13631488 kB' 00:04:02.160 16:12:30 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:02.160 16:12:30 -- setup/common.sh@32 -- # continue 00:04:02.160 16:12:30 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.160 16:12:30 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.160 16:12:30 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:02.160 16:12:30 -- setup/common.sh@32 -- # continue 00:04:02.160 16:12:30 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.160 16:12:30 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.160 16:12:30 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:02.160 16:12:30 -- setup/common.sh@32 -- # continue 00:04:02.160 16:12:30 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.160 16:12:30 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.160 16:12:30 -- setup/common.sh@32 -- # [[ Buffers == 
\H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:02.160 16:12:30 -- setup/common.sh@32 -- # continue 00:04:02.160 16:12:30 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.160 16:12:30 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.160 16:12:30 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:02.160 16:12:30 -- setup/common.sh@32 -- # continue 00:04:02.160 16:12:30 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.160 16:12:30 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.160 16:12:30 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:02.160 16:12:30 -- setup/common.sh@32 -- # continue 00:04:02.160 16:12:30 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.160 16:12:30 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.160 16:12:30 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:02.160 16:12:30 -- setup/common.sh@32 -- # continue 00:04:02.160 16:12:30 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.160 16:12:30 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.160 16:12:30 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:02.160 16:12:30 -- setup/common.sh@32 -- # continue 00:04:02.160 16:12:30 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.160 16:12:30 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.160 16:12:30 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:02.160 16:12:30 -- setup/common.sh@32 -- # continue 00:04:02.160 16:12:30 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.160 16:12:30 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.160 16:12:30 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:02.160 16:12:30 -- setup/common.sh@32 -- # continue 00:04:02.160 16:12:30 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.161 16:12:30 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.161 16:12:30 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:02.161 16:12:30 -- setup/common.sh@32 -- # continue 00:04:02.161 16:12:30 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.161 16:12:30 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.161 16:12:30 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:02.161 16:12:30 -- setup/common.sh@32 -- # continue 00:04:02.161 16:12:30 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.161 16:12:30 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.161 16:12:30 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:02.161 16:12:30 -- setup/common.sh@32 -- # continue 00:04:02.161 16:12:30 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.161 16:12:30 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.161 16:12:30 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:02.161 16:12:30 -- setup/common.sh@32 -- # continue 00:04:02.161 16:12:30 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.161 16:12:30 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.161 16:12:30 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:02.161 16:12:30 -- setup/common.sh@32 -- # continue 00:04:02.161 16:12:30 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.161 16:12:30 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.161 16:12:30 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:02.161 16:12:30 -- setup/common.sh@32 -- # continue 00:04:02.161 16:12:30 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.161 16:12:30 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.161 16:12:30 
-- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:02.161 16:12:30 -- setup/common.sh@32 -- # continue [... the same IFS=': ' / read -r var val _ / compare / continue cycle repeats for Zswapped, Dirty, Writeback, AnonPages, Mapped, Shmem, KReclaimable, Slab, SReclaimable, SUnreclaim, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, CmaTotal, CmaFree, Unaccepted, HugePages_Total, HugePages_Free, HugePages_Rsvd and HugePages_Surp; none matches \H\u\g\e\p\a\g\e\s\i\z\e ...] 00:04:02.161 16:12:30 -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:02.161 16:12:30 -- setup/common.sh@33 -- # echo 2048 00:04:02.161 16:12:30 -- setup/common.sh@33 -- # return 0 00:04:02.161 16:12:30 -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:04:02.161 16:12:30 -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:04:02.161 16:12:30 -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:04:02.162 16:12:30 -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:04:02.162 16:12:30 -- setup/hugepages.sh@22 -- # unset -v HUGEMEM
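The long compare-and-continue run above is setup/common.sh's get_meminfo helper walking /proc/meminfo one 'Key: value' pair at a time until it reaches the requested field, here Hugepagesize, which yields 2048 (kB). A minimal sketch of that pattern, with an illustrative function body rather than the script's exact internals (the real helper also accepts a NUMA node and then reads /sys/devices/system/node/node$N/meminfo, first stripping the 'Node N ' prefix those lines carry):

  # Sketch: print the value of one /proc/meminfo field, e.g. `get_meminfo Hugepagesize` -> 2048
  get_meminfo() {
      local get=$1 var val _
      while IFS=': ' read -r var val _; do   # "Hugepagesize:    2048 kB" -> var=Hugepagesize val=2048
          if [[ $var == "$get" ]]; then
              echo "$val"
              return 0
          fi
      done < /proc/meminfo
      return 1                               # field not present
  }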
00:04:02.162 16:12:30 -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:04:02.162 16:12:30 -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:04:02.162 16:12:30 -- setup/hugepages.sh@207 -- # get_nodes 00:04:02.162 16:12:30 -- setup/hugepages.sh@27 -- # local node 00:04:02.162 16:12:30 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:02.162 16:12:30 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=2048 00:04:02.162 16:12:30 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:02.162 16:12:30 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:02.162 16:12:30 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:02.162 16:12:30 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:02.162 16:12:30 -- setup/hugepages.sh@208 -- # clear_hp 00:04:02.162 16:12:30 -- setup/hugepages.sh@37 -- # local node hp 00:04:02.162 16:12:30 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:02.162 16:12:30 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:02.162 16:12:30 -- setup/hugepages.sh@41 -- # echo 0 00:04:02.162 16:12:30 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:02.162 16:12:30 -- setup/hugepages.sh@41 -- # echo 0 00:04:02.162 16:12:30 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:02.162 16:12:30 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:02.162 16:12:30 -- setup/hugepages.sh@41 -- # echo 0 00:04:02.162 16:12:30 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:02.162 16:12:30 -- setup/hugepages.sh@41 -- # echo 0 00:04:02.162 16:12:30 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:02.162 16:12:30 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:04:02.162 16:12:30 -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:04:02.162 16:12:30 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:02.162 16:12:30 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:02.162 16:12:30 -- common/autotest_common.sh@10 -- # set +x 00:04:02.162 ************************************ 00:04:02.162 START TEST default_setup 00:04:02.162 ************************************ 00:04:02.162 16:12:30 -- common/autotest_common.sh@1104 -- # default_setup 00:04:02.162 16:12:30 -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:04:02.162 16:12:30 -- setup/hugepages.sh@49 -- # local size=2097152 00:04:02.162 16:12:30 -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:04:02.162 16:12:30 -- setup/hugepages.sh@51 -- # shift 00:04:02.162 16:12:30 -- setup/hugepages.sh@52 -- # node_ids=('0') 00:04:02.162 16:12:30 -- setup/hugepages.sh@52 -- # local node_ids 00:04:02.162 16:12:30 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:02.162 16:12:30 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:02.162 16:12:30 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:04:02.162 16:12:30 -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:04:02.162 16:12:30 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:02.162 16:12:30 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:02.162 16:12:30 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:02.162 16:12:30 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:02.162 16:12:30 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:02.162 16:12:30 -- setup/hugepages.sh@69 -- # (( 1 > 0 ))
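clear_hp above walks every hugepage pool under both NUMA nodes and writes 0 to it, after which get_test_nr_hugepages budgets the test's pool: 2097152 kB requested at the default 2048 kB page size works out to nr_hugepages=1024. The sysfs writes behind those traces look roughly like this (a sketch built on the standard paths the log itself shows, not the script verbatim):

  # Zero every per-node pool, as the echo-0 loop in clear_hp does
  for hp in /sys/devices/system/node/node*/hugepages/hugepages-*/nr_hugepages; do
      echo 0 > "$hp"
  done
  # Then request the test's pool: 2097152 kB / 2048 kB = 1024 pages
  echo 1024 > /proc/sys/vm/nr_hugepages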
00:04:02.162 16:12:30 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:02.162 16:12:30 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:04:02.162 16:12:30 -- setup/hugepages.sh@73 -- # return 0 00:04:02.162 16:12:30 -- setup/hugepages.sh@137 -- # setup output 00:04:02.162 16:12:30 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:02.162 16:12:30 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:05.465 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:05.465 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:05.465 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:05.465 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:05.465 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:05.465 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:05.465 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:05.465 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:05.465 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:05.465 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:05.465 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:05.465 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:05.465 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:05.465 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:05.465 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:05.465 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:07.370 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:04:07.370 16:12:35 -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:04:07.370 16:12:35 -- setup/hugepages.sh@89 -- # local node 00:04:07.370 16:12:35 -- setup/hugepages.sh@90 -- # local sorted_t 00:04:07.370 16:12:35 -- setup/hugepages.sh@91 -- # local sorted_s 00:04:07.370 16:12:35 -- setup/hugepages.sh@92 -- # local surp 00:04:07.370 16:12:35 -- setup/hugepages.sh@93 -- # local resv 00:04:07.370 16:12:35 -- setup/hugepages.sh@94 -- # local anon 00:04:07.370 16:12:35 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:07.370 16:12:35 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:07.370 16:12:35 -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:07.370 16:12:35 -- setup/common.sh@18 -- # local node= 00:04:07.370 16:12:35 -- setup/common.sh@19 -- # local var val 00:04:07.370 16:12:35 -- setup/common.sh@20 -- # local mem_f mem 00:04:07.370 16:12:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:07.370 16:12:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:07.370 16:12:35 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:07.370 16:12:35 -- setup/common.sh@28 -- # mapfile -t mem 00:04:07.370 16:12:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:07.370 16:12:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.370 16:12:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.371 16:12:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42401156 kB' 'MemAvailable: 44256448 kB' 'Buffers: 4300 kB' 'Cached: 11309676 kB' 'SwapCached: 28 kB' 'Active: 10456568 kB' 'Inactive: 1486324 kB' 'Active(anon): 9973312 kB' 'Inactive(anon): 32212 kB' 'Active(file): 483256 kB' 'Inactive(file): 1454112 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 632348 kB' 'Mapped: 177424 kB' 'Shmem: 9376608 kB' 'KReclaimable: 565620 kB' 'Slab: 1210664 kB' 'SReclaimable: 565620 kB' 'SUnreclaim: 645044 kB' 'KernelStack: 
22048 kB' 'PageTables: 9112 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 11405072 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216388 kB' 'VmallocChunk: 0 kB' 'Percpu: 104832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3177844 kB' 'DirectMap2M: 52082688 kB' 'DirectMap1G: 13631488 kB' 00:04:07.371 16:12:35 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.371 16:12:35 -- setup/common.sh@32 -- # continue [... the same read/compare/continue cycle repeats for MemFree, MemAvailable, Buffers, Cached, SwapCached, Active, Inactive, Active(anon), Inactive(anon), Active(file), Inactive(file), Unevictable, Mlocked, SwapTotal, SwapFree, Zswap, Zswapped, Dirty, Writeback, AnonPages, Mapped, Shmem, KReclaimable, Slab, SReclaimable, SUnreclaim, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu and HardwareCorrupted; none matches \A\n\o\n\H\u\g\e\P\a\g\e\s ...] 00:04:07.372 16:12:35 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.372 16:12:35 -- setup/common.sh@33 -- # echo 0 00:04:07.372 16:12:35 -- setup/common.sh@33 -- # return 0 00:04:07.372 16:12:35 -- setup/hugepages.sh@97 -- # anon=0 00:04:07.372 16:12:35 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:07.372 16:12:35 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:07.372 16:12:35 -- setup/common.sh@18 -- # local node= 00:04:07.372 16:12:35 -- setup/common.sh@19 -- # local var val 00:04:07.372 16:12:35 -- setup/common.sh@20 -- # local mem_f mem 00:04:07.372 16:12:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:07.372 16:12:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:07.372 16:12:35 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:07.372 16:12:35 -- setup/common.sh@28 -- # mapfile -t mem 00:04:07.372 16:12:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:07.372 16:12:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.372 16:12:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.372 16:12:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42403344 kB' 'MemAvailable: 44258636 kB' 'Buffers: 4300 kB' 'Cached: 11309676 kB' 'SwapCached: 28 kB' 'Active: 10457364 kB' 'Inactive: 1486324 kB' 'Active(anon): 9974108 kB' 'Inactive(anon): 32212 kB' 'Active(file): 483256 kB' 'Inactive(file): 1454112 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 633220 kB' 'Mapped: 177416 kB' 'Shmem: 9376608 kB' 'KReclaimable: 565620 kB' 'Slab: 1210648 kB' 'SReclaimable: 565620 kB' 'SUnreclaim: 645028 kB' 'KernelStack: 22016 kB' 'PageTables: 8716 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 11406460 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216404 kB' 'VmallocChunk: 0 kB' 'Percpu: 104832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3177844 kB' 'DirectMap2M: 52082688 kB' 'DirectMap1G: 13631488 kB'
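Back in the scripts/setup.sh output above, each '(8086 2021): ioatdma -> vfio-pci' line records a PCI function being detached from its kernel driver and handed to vfio-pci so SPDK can drive it from user space; the last line does the same for the NVMe SSD at 0000:d8:00.0. One standard sysfs sequence that produces this kind of rebind is sketched below; setup.sh's own logic layers device discovery, hugepages and permissions on top of it, so treat this as illustrative only:

  bdf=0000:00:04.0                                             # one of the I/OAT channels above
  modprobe vfio-pci
  echo "$bdf" > "/sys/bus/pci/devices/$bdf/driver/unbind"      # detach ioatdma
  echo vfio-pci > "/sys/bus/pci/devices/$bdf/driver_override"  # pin the replacement driver
  echo "$bdf" > /sys/bus/pci/drivers_probe                     # rebind to vfio-pci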
00:04:07.372 16:12:35 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.372 16:12:35 -- setup/common.sh@32 -- # continue [... the scan repeats for MemFree, MemAvailable, Buffers, Cached, SwapCached, Active, Inactive, Active(anon), Inactive(anon), Active(file), Inactive(file), Unevictable, Mlocked, SwapTotal, SwapFree, Zswap, Zswapped, Dirty, Writeback, AnonPages, Mapped, Shmem, KReclaimable, Slab, SReclaimable, SUnreclaim, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, CmaTotal, CmaFree, Unaccepted, HugePages_Total, HugePages_Free and HugePages_Rsvd; none matches \H\u\g\e\P\a\g\e\s\_\S\u\r\p ...] 00:04:07.373 16:12:35 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.373 16:12:35 -- setup/common.sh@33 -- # echo 0 00:04:07.373 16:12:35 -- setup/common.sh@33 -- # return 0 00:04:07.373 16:12:35 -- setup/hugepages.sh@99 -- # surp=0 00:04:07.373 16:12:35 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:07.373 16:12:35 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:07.373 16:12:35 -- setup/common.sh@18 -- # local node= 00:04:07.373 16:12:35 -- setup/common.sh@19 -- # local var val 00:04:07.373 16:12:35 -- setup/common.sh@20 -- # local mem_f mem 00:04:07.373 16:12:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:07.373 16:12:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:07.373 16:12:35 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:07.373 16:12:35 -- setup/common.sh@28 -- # mapfile -t mem 00:04:07.373 16:12:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:07.373 16:12:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.373 16:12:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.373 16:12:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42404576 kB' 'MemAvailable: 44259868 kB' 'Buffers: 4300 kB' 'Cached: 11309676 kB' 'SwapCached: 28 kB' 'Active: 10456552 kB' 'Inactive: 1486324 kB' 'Active(anon): 9973296 kB' 'Inactive(anon): 32212 kB' 'Active(file): 483256 kB' 'Inactive(file): 1454112 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 632384 kB' 'Mapped: 177280 kB' 'Shmem: 9376608 kB' 'KReclaimable: 565620 kB' 'Slab: 1210648 kB' 'SReclaimable: 565620 kB' 'SUnreclaim: 645028 kB' 'KernelStack: 22160 kB' 'PageTables: 8876 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 11406476 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216436 kB' 'VmallocChunk: 0 kB' 'Percpu: 104832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3177844 kB' 'DirectMap2M: 52082688 kB' 'DirectMap1G: 13631488 kB'
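verify_nr_hugepages calls get_meminfo once per field (AnonHugePages, then HugePages_Surp, then HugePages_Rsvd, then HugePages_Total), which is why the full meminfo scan repeats four times in this stretch of the log. Outside the harness the same counters can be pulled in a single pass, for example:

  awk '/^(AnonHugePages|HugePages_Total|HugePages_Rsvd|HugePages_Surp):/ {print $1, $2}' /proc/meminfo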
00:04:07.373 16:12:35 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.373 16:12:35 -- setup/common.sh@32 -- # continue [... the scan repeats for MemFree, MemAvailable, Buffers, Cached, SwapCached, Active, Inactive, Active(anon), Inactive(anon), Active(file), Inactive(file), Unevictable, Mlocked, SwapTotal, SwapFree, Zswap, Zswapped, Dirty, Writeback, AnonPages, Mapped, Shmem, KReclaimable, Slab, SReclaimable, SUnreclaim, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, CmaTotal, CmaFree, Unaccepted, HugePages_Total and HugePages_Free; none matches \H\u\g\e\P\a\g\e\s\_\R\s\v\d ...] 00:04:07.375 16:12:35 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.375 16:12:35 -- setup/common.sh@33 -- # echo 0 00:04:07.375 16:12:35 -- setup/common.sh@33 -- # return 0 00:04:07.375 16:12:35 -- setup/hugepages.sh@100 -- # resv=0 00:04:07.375 16:12:35 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:07.375 nr_hugepages=1024 16:12:35 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:07.375 resv_hugepages=0 16:12:35 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:07.375 surplus_hugepages=0 16:12:35 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:07.375 anon_hugepages=0 16:12:35 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:07.375 16:12:35 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:07.375 16:12:35 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
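The two arithmetic tests just traced are the heart of verify_nr_hugepages: with anon, surplus and reserved pages all 0, the kernel's HugePages_Total must equal the 1024 pages default_setup requested. Reduced to a sketch (get_meminfo as in the earlier sketch; the commented values are the ones in the snapshots above):

  nr_hugepages=1024
  anon=$(get_meminfo AnonHugePages)      # 0
  surp=$(get_meminfo HugePages_Surp)     # 0
  resv=$(get_meminfo HugePages_Rsvd)     # 0
  total=$(get_meminfo HugePages_Total)   # 1024
  (( total == nr_hugepages + surp + resv ))   # no stray surplus/reserved pages
  (( total == nr_hugepages ))                 # pool is exactly the requested size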
00:04:07.375 16:12:35 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:07.375 16:12:35 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:07.375 16:12:35 -- setup/common.sh@18 -- # local node=
00:04:07.375 16:12:35 -- setup/common.sh@19 -- # local var val
00:04:07.375 16:12:35 -- setup/common.sh@20 -- # local mem_f mem
00:04:07.375 16:12:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:07.375 16:12:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:07.375 16:12:35 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:07.375 16:12:35 -- setup/common.sh@28 -- # mapfile -t mem
00:04:07.375 16:12:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:07.375 16:12:35 -- setup/common.sh@31 -- # IFS=': '
00:04:07.375 16:12:35 -- setup/common.sh@31 -- # read -r var val _
00:04:07.375 16:12:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42404040 kB' 'MemAvailable: 44259332 kB' 'Buffers: 4300 kB' 'Cached: 11309676 kB' 'SwapCached: 28 kB' 'Active: 10456492 kB' 'Inactive: 1486324 kB' 'Active(anon): 9973236 kB' 'Inactive(anon): 32212 kB' 'Active(file): 483256 kB' 'Inactive(file): 1454112 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 632340 kB' 'Mapped: 177280 kB' 'Shmem: 9376608 kB' 'KReclaimable: 565620 kB' 'Slab: 1210640 kB' 'SReclaimable: 565620 kB' 'SUnreclaim: 645020 kB' 'KernelStack: 22016 kB' 'PageTables: 9076 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 11406336 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216404 kB' 'VmallocChunk: 0 kB' 'Percpu: 104832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3177844 kB' 'DirectMap2M: 52082688 kB' 'DirectMap1G: 13631488 kB'
[... per-key scan: every key from MemTotal onward is tested against HugePages_Total and skipped via continue ...]
00:04:07.376 16:12:35 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:07.376 16:12:35 -- setup/common.sh@33 -- # echo 1024
00:04:07.376 16:12:35 -- setup/common.sh@33 -- # return 0
00:04:07.376 16:12:35 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:07.376 16:12:35 -- setup/hugepages.sh@112 -- # get_nodes
00:04:07.376 16:12:35 -- setup/hugepages.sh@27 -- # local node
00:04:07.376 16:12:35 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:07.376 16:12:35 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:07.376 16:12:35 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:07.376 16:12:35 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:07.376 16:12:35 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:07.376 16:12:35 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:07.376 16:12:35 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:07.376 16:12:35 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:07.376 16:12:35 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:07.376 16:12:35 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:07.376 16:12:35 -- setup/common.sh@18 -- # local node=0
00:04:07.376 16:12:35 -- setup/common.sh@19 -- # local var val
00:04:07.376 16:12:35 -- setup/common.sh@20 -- # local mem_f mem
00:04:07.376 16:12:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:07.376 16:12:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:07.376 16:12:35 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:07.376 16:12:35 -- setup/common.sh@28 -- # mapfile -t mem
00:04:07.376 16:12:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:07.376 16:12:35 -- setup/common.sh@31 -- # IFS=': '
00:04:07.376 16:12:35 -- setup/common.sh@31 -- # read -r var val _
00:04:07.376 16:12:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 23751152 kB' 'MemUsed: 8840932 kB' 'SwapCached: 16 kB' 'Active: 4524424 kB' 'Inactive: 338724 kB' 'Active(anon): 4306204 kB' 'Inactive(anon): 16 kB' 'Active(file): 218220 kB' 'Inactive(file): 338708 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4477564 kB' 'Mapped: 110200 kB' 'AnonPages: 389224 kB' 'Shmem: 3920620 kB' 'KernelStack: 12968 kB' 'PageTables: 5056 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 349476 kB' 'Slab: 640436 kB' 'SReclaimable: 349476 kB' 'SUnreclaim: 290960 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
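The node0 lookup above shows the other branch of the same helper: given a node number, the read source switches from /proc/meminfo to the per-node sysfs file, whose lines carry a 'Node N ' prefix that is stripped before the identical key scan runs. A sketch of that branch, with names copied from the trace and the rest assumed:

#!/usr/bin/env bash
# Per-node branch (sketch): pick the node file when it exists, then strip
# the "Node 0 " prefix so lines look like ordinary /proc/meminfo lines.
shopt -s extglob                  # +([0-9]) below is an extglob pattern
node=0
mem_f=/proc/meminfo
if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
    mem_f=/sys/devices/system/node/node$node/meminfo
fi
mapfile -t mem < "$mem_f"
mem=("${mem[@]#Node +([0-9]) }")  # "Node 0 HugePages_Surp: 0" -> "HugePages_Surp: 0"
printf '%s\n' "${mem[@]}" | grep HugePages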
[... per-key scan of the node0 meminfo: each key tested against HugePages_Surp and skipped via continue ...]
00:04:07.377 16:12:35 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:07.377 16:12:35 -- setup/common.sh@33 -- # echo 0
00:04:07.377 16:12:35 -- setup/common.sh@33 -- # return 0
00:04:07.377 16:12:35 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:07.377 16:12:35 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:07.377 16:12:35 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:07.377 16:12:35 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:07.377 16:12:35 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:04:07.377 node0=1024 expecting 1024
00:04:07.377 16:12:35 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:04:07.377 
00:04:07.377 real	0m5.150s
00:04:07.377 user	0m1.376s
00:04:07.377 sys	0m2.275s
00:04:07.377 16:12:35 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:07.377 16:12:35 -- common/autotest_common.sh@10 -- # set +x
00:04:07.377 ************************************
00:04:07.377 END TEST default_setup
00:04:07.377 ************************************
00:04:07.377 16:12:35 -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc
00:04:07.377 16:12:35 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:07.377 16:12:35 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:07.377 16:12:35 -- common/autotest_common.sh@10 -- # set +x
00:04:07.377 ************************************
00:04:07.377 START TEST per_node_1G_alloc
00:04:07.377 ************************************
00:04:07.377 16:12:35 -- common/autotest_common.sh@1104 -- # per_node_1G_alloc
00:04:07.377 16:12:35 -- setup/hugepages.sh@143 -- # local IFS=,
00:04:07.377 16:12:35 -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1
00:04:07.377 16:12:35 -- setup/hugepages.sh@49 -- # local size=1048576
00:04:07.377 16:12:35 -- setup/hugepages.sh@50 -- # (( 3 > 1 ))
00:04:07.377 16:12:35 -- setup/hugepages.sh@51 -- # shift
00:04:07.377 16:12:35 -- setup/hugepages.sh@52 -- # node_ids=('0' '1')
00:04:07.377 16:12:35 -- setup/hugepages.sh@52 -- # local node_ids
00:04:07.377 16:12:35 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:07.377 16:12:35 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:04:07.377 16:12:35 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1
00:04:07.377 16:12:35 -- setup/hugepages.sh@62 -- # user_nodes=('0' '1')
00:04:07.377 16:12:35 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:07.377 16:12:35 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:04:07.377 16:12:35 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:07.377 16:12:35 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:07.377 16:12:35 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:07.377 16:12:35 -- setup/hugepages.sh@69 -- # (( 2 > 0 ))
00:04:07.377 16:12:35 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:04:07.377 16:12:35 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:04:07.377 16:12:35 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:04:07.377 16:12:35 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:04:07.377 16:12:35 -- setup/hugepages.sh@73 -- # return 0
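Stripped of the xtrace noise, the get_test_nr_hugepages trace above is plain arithmetic: 1048576 kB (1 GiB) divided by the 2048 kB Hugepagesize seen in every snapshot gives 512 pages, and each node named in the argument list gets that count. In shell terms (illustrative variable names, not the script's exact code):

#!/usr/bin/env bash
# Page-count math behind the trace above (values taken from the log).
size=1048576             # requested allocation per node, in kB (1 GiB)
default_hugepages=2048   # Hugepagesize reported in the meminfo snapshots
nr_hugepages=$(( size / default_hugepages ))   # 1048576 / 2048 = 512
declare -a nodes_test
for node_id in 0 1; do                         # the two requested nodes
    nodes_test[node_id]=$nr_hugepages
done
echo "node0=${nodes_test[0]} node1=${nodes_test[1]}"   # 512 pages each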
00:04:07.378 16:12:35 -- setup/hugepages.sh@146 -- # NRHUGE=512
00:04:07.378 16:12:35 -- setup/hugepages.sh@146 -- # HUGENODE=0,1
00:04:07.378 16:12:35 -- setup/hugepages.sh@146 -- # setup output
00:04:07.378 16:12:35 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:07.378 16:12:35 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:10.668 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:10.668 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:10.668 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:10.668 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:10.668 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:10.668 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:10.668 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:10.668 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:10.668 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:10.668 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:10.668 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:10.668 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:10.668 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:10.668 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:10.668 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:10.668 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:10.668 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:10.668 16:12:39 -- setup/hugepages.sh@147 -- # nr_hugepages=1024
00:04:10.668 16:12:39 -- setup/hugepages.sh@147 -- # verify_nr_hugepages
00:04:10.668 16:12:39 -- setup/hugepages.sh@89 -- # local node
00:04:10.668 16:12:39 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:10.668 16:12:39 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:10.668 16:12:39 -- setup/hugepages.sh@92 -- # local surp
00:04:10.668 16:12:39 -- setup/hugepages.sh@93 -- # local resv
00:04:10.668 16:12:39 -- setup/hugepages.sh@94 -- # local anon
00:04:10.668 16:12:39 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:10.668 16:12:39 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:10.668 16:12:39 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:10.668 16:12:39 -- setup/common.sh@18 -- # local node=
00:04:10.668 16:12:39 -- setup/common.sh@19 -- # local var val
00:04:10.668 16:12:39 -- setup/common.sh@20 -- # local mem_f mem
00:04:10.668 16:12:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:10.668 16:12:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:10.668 16:12:39 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:10.668 16:12:39 -- setup/common.sh@28 -- # mapfile -t mem
00:04:10.668 16:12:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:10.668 16:12:39 -- setup/common.sh@31 -- # IFS=': '
00:04:10.668 16:12:39 -- setup/common.sh@31 -- # read -r var val _
00:04:10.668 16:12:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42465700 kB' 'MemAvailable: 44320992 kB' 'Buffers: 4300 kB' 'Cached: 11309792 kB' 'SwapCached: 28 kB' 'Active: 10455124 kB' 'Inactive: 1486324 kB' 'Active(anon): 9971868 kB' 'Inactive(anon): 32212 kB' 'Active(file): 483256 kB' 'Inactive(file): 1454112 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 630616 kB' 'Mapped: 177316 kB' 'Shmem: 9376724 kB' 'KReclaimable: 565620 kB' 'Slab: 1211388 kB' 'SReclaimable: 565620 kB' 'SUnreclaim: 645768 kB' 'KernelStack: 21952 kB' 'PageTables: 8660 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 11403216 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216516 kB' 'VmallocChunk: 0 kB' 'Percpu: 104832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3177844 kB' 'DirectMap2M: 52082688 kB' 'DirectMap1G: 13631488 kB'
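verify_nr_hugepages opens by testing the transparent-hugepage policy string before it counts AnonHugePages; on this machine the policy was 'always [madvise] never', so the counter is read (and comes back 0). A hedged reading of that @96 guard, not the script's verbatim code:

#!/usr/bin/env bash
# Reconstructed guard (assumption): anon huge pages only need counting
# when THP is not pinned to "[never]".
thp=$(cat /sys/kernel/mm/transparent_hugepage/enabled 2>/dev/null ||
      echo 'always [madvise] never')            # fallback mirrors the log
anon=0
if [[ $thp != *'[never]'* ]]; then
    anon=$(awk '$1 == "AnonHugePages:" { print $2 }' /proc/meminfo)
fi
echo "anon_hugepages=${anon} kB"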
[... per-key scan: every key tested against AnonHugePages and skipped via continue ...]
00:04:10.669 16:12:39 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:10.669 16:12:39 -- setup/common.sh@33 -- # echo 0
00:04:10.669 16:12:39 -- setup/common.sh@33 -- # return 0
00:04:10.669 16:12:39 -- setup/hugepages.sh@97 -- # anon=0
00:04:10.669 16:12:39 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:10.669 16:12:39 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:10.669 16:12:39 -- setup/common.sh@18 -- # local node=
00:04:10.669 16:12:39 -- setup/common.sh@19 -- # local var val
00:04:10.669 16:12:39 -- setup/common.sh@20 -- # local mem_f mem
00:04:10.669 16:12:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:10.669 16:12:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:10.669 16:12:39 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:10.669 16:12:39 -- setup/common.sh@28 -- # mapfile -t mem
00:04:10.670 16:12:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:10.670 16:12:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42465484 kB' [... full /proc/meminfo snapshot; the hugepage counters are unchanged: 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' ...] 'DirectMap4k: 3177844 kB' 'DirectMap2M: 52082688 kB' 'DirectMap1G: 13631488 kB'
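Each counter in this verify pass (anon, surp, resv, total) costs its own full scan of the file. For comparison, all four can be collected in one pass; this is a sketch of an equivalent result, not how the script itself does it:

#!/usr/bin/env bash
# Single-pass collection of the hugepage counters the verify pass reads.
read -r total free resv surp < <(awk '
    $1 == "HugePages_Total:" { t = $2 }
    $1 == "HugePages_Free:"  { f = $2 }
    $1 == "HugePages_Rsvd:"  { r = $2 }
    $1 == "HugePages_Surp:"  { s = $2 }
    END { print t, f, r, s }' /proc/meminfo)
echo "total=$total free=$free resv=$resv surp=$surp"   # 1024 1024 0 0 here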
[... per-key scan: every key tested against HugePages_Surp and skipped via continue ...]
00:04:10.671 16:12:39 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:10.671 16:12:39 -- setup/common.sh@33 -- # echo 0
00:04:10.671 16:12:39 -- setup/common.sh@33 -- # return 0
00:04:10.671 16:12:39 -- setup/hugepages.sh@99 -- # surp=0
00:04:10.671 16:12:39 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:10.671 16:12:39 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:10.671 16:12:39 -- setup/common.sh@18 -- # local node=
00:04:10.671 16:12:39 -- setup/common.sh@19 -- # local var val
00:04:10.671 16:12:39 -- setup/common.sh@20 -- # local mem_f mem
00:04:10.671 16:12:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:10.671 16:12:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:10.671 16:12:39 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:10.671 16:12:39 -- setup/common.sh@28 -- # mapfile -t mem
00:04:10.671 16:12:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
[... /proc/meminfo snapshot printed again, then the per-key scan for HugePages_Rsvd begins (MemTotal, MemFree, MemAvailable, ... skipped via continue) ...]
setup/common.sh@31 -- # IFS=': ' 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # [[ SwapFree == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.671 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.671 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # read -r 
var val _ 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # continue 
00:04:10.672 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.672 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.672 16:12:39 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:10.672 16:12:39 -- setup/common.sh@33 -- # echo 0 00:04:10.672 16:12:39 -- setup/common.sh@33 -- # return 0 00:04:10.672 16:12:39 -- setup/hugepages.sh@100 -- # resv=0 00:04:10.672 16:12:39 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:10.672 nr_hugepages=1024 00:04:10.672 16:12:39 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:10.672 resv_hugepages=0 00:04:10.672 16:12:39 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:10.672 surplus_hugepages=0 00:04:10.672 16:12:39 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:10.672 anon_hugepages=0 00:04:10.672 16:12:39 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:10.672 16:12:39 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 
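For readers following the trace: the repeated @31/@32 lines above are one loop inside setup/common.sh's get_meminfo helper. Below is a minimal re-creation of what the traced commands appear to do; the function and variable names (get_meminfo, mem_f, var, val) are taken from the trace itself, but this is a sketch reconstructed from the log, not the upstream implementation.

    #!/usr/bin/env bash
    shopt -s extglob   # needed for the "Node <n> " prefix strip below
    # Sketch of the traced helper: look one key up in /proc/meminfo, or in a
    # per-node meminfo file when a node number is supplied.
    get_meminfo() {
        local get=$1 node=${2:-}
        local var val _
        local mem_f=/proc/meminfo mem line
        # With a node argument, read that node's meminfo instead (traced @23/@24).
        if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        # Per-node files prefix every line with "Node <n> "; drop that prefix.
        mem=("${mem[@]#Node +([0-9]) }")
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] || continue   # skip keys until the requested one
            echo "$val"
            return 0
        done
        return 1
    }
    get_meminfo HugePages_Surp     # prints 0 on the machine traced above
    get_meminfo HugePages_Surp 0   # same lookup against node 0's meminfo

The shell-level scan explains the long runs of continue lines in the log: every key before the requested one produces one IFS/read/compare/continue cycle in the xtrace.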
00:04:10.672 16:12:39 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:10.672 16:12:39 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:10.672 16:12:39 -- setup/common.sh@18 -- # local node=
00:04:10.672 16:12:39 -- setup/common.sh@19 -- # local var val
00:04:10.672 16:12:39 -- setup/common.sh@20 -- # local mem_f mem
00:04:10.672 16:12:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:10.672 16:12:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:10.672 16:12:39 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:10.672 16:12:39 -- setup/common.sh@28 -- # mapfile -t mem
00:04:10.672 16:12:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:10.672 16:12:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42465744 kB' 'MemAvailable: 44321036 kB' 'Buffers: 4300 kB' 'Cached: 11309836 kB' 'SwapCached: 28 kB' 'Active: 10454540 kB' 'Inactive: 1486324 kB' 'Active(anon): 9971284 kB' 'Inactive(anon): 32212 kB' 'Active(file): 483256 kB' 'Inactive(file): 1454112 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 629964 kB' 'Mapped: 177292 kB' 'Shmem: 9376768 kB' 'KReclaimable: 565620 kB' 'Slab: 1211480 kB' 'SReclaimable: 565620 kB' 'SUnreclaim: 645860 kB' 'KernelStack: 21920 kB' 'PageTables: 8552 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 11403256 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216500 kB' 'VmallocChunk: 0 kB' 'Percpu: 104832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3177844 kB' 'DirectMap2M: 52082688 kB' 'DirectMap1G: 13631488 kB'
00:04:10.672 16:12:39 -- setup/common.sh@31 -- # IFS=': '
00:04:10.672 16:12:39 -- setup/common.sh@31 -- # read -r var val _
00:04:10.672 16:12:39 -- setup/common.sh@32 -- # [xtrace condensed: per-key scan over the snapshot, MemTotal through Unaccepted, none matching HugePages_Total]
00:04:10.674 16:12:39 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:10.674 16:12:39 -- setup/common.sh@33 -- # echo 1024
00:04:10.674 16:12:39 -- setup/common.sh@33 -- # return 0
00:04:10.674 16:12:39 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:10.674 16:12:39 -- setup/hugepages.sh@112 -- # get_nodes
00:04:10.674 16:12:39 -- setup/hugepages.sh@27 -- # local node
00:04:10.674 16:12:39 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:10.674 16:12:39 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:10.674 16:12:39 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:10.674 16:12:39 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:10.674 16:12:39 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:10.674 16:12:39 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:10.674 16:12:39 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:10.674 16:12:39 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:10.674 16:12:39 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:10.674 16:12:39 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:10.674 16:12:39 -- setup/common.sh@18 -- # local node=0
00:04:10.674 16:12:39 -- setup/common.sh@19 -- # local var val
00:04:10.674 16:12:39 -- setup/common.sh@20 -- # local mem_f mem
00:04:10.674 16:12:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:10.674 16:12:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:10.674 16:12:39 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:10.674 16:12:39 -- setup/common.sh@28 -- # mapfile -t mem
00:04:10.674 16:12:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:10.674 16:12:39 -- setup/common.sh@31 -- # IFS=': '
00:04:10.674 16:12:39 -- setup/common.sh@31 -- # read -r var val _
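The @110 guard just traced is the test's core accounting identity: the kernel's HugePages_Total must equal the pages the test requested plus any surplus and reserved pages. Spelled out with the values read in this run (a sketch reusing the hypothetical get_meminfo approximation above):

    nr_hugepages=1024                     # requested by the test setup
    surp=$(get_meminfo HugePages_Surp)    # 0 in the trace above
    resv=$(get_meminfo HugePages_Rsvd)    # 0 in the trace above
    total=$(get_meminfo HugePages_Total)  # 1024 in the trace above
    # 1024 == 1024 + 0 + 0, so the guard passes and the per-node checks run.
    (( total == nr_hugepages + surp + resv )) || echo 'hugepage accounting mismatch' >&2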
00:04:10.934 16:12:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 24832332 kB' 'MemUsed: 7759752 kB' 'SwapCached: 16 kB' 'Active: 4523668 kB' 'Inactive: 338724 kB' 'Active(anon): 4305448 kB' 'Inactive(anon): 16 kB' 'Active(file): 218220 kB' 'Inactive(file): 338708 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4477568 kB' 'Mapped: 110204 kB' 'AnonPages: 387952 kB' 'Shmem: 3920624 kB' 'KernelStack: 12856 kB' 'PageTables: 5148 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 349476 kB' 'Slab: 641220 kB' 'SReclaimable: 349476 kB' 'SUnreclaim: 291744 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:04:10.934 16:12:39 -- setup/common.sh@32 -- # [xtrace condensed: per-key scan over the node0 snapshot, MemTotal through HugePages_Free, none matching HugePages_Surp]
00:04:10.935 16:12:39 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:10.935 16:12:39 -- setup/common.sh@33 -- # echo 0
00:04:10.935 16:12:39 -- setup/common.sh@33 -- # return 0
00:04:10.935 16:12:39 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
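From @117 onward the same lookup runs once per NUMA node, switching mem_f to /sys/devices/system/node/nodeN/meminfo (paths as traced for node 0 above and node 1 below). A hedged sketch of that per-node sweep, reusing the get_meminfo approximation from earlier; the glob loop here is illustrative, since the real script walks "${!nodes_test[@]}" instead:

    # Query HugePages_Surp from each node's own meminfo file. The traced
    # script globs node+([0-9]) with extglob; node[0-9]* is the plain
    # equivalent for this sketch.
    for node_dir in /sys/devices/system/node/node[0-9]*; do
        node=${node_dir##*node}   # same suffix strip as the traced ${node##*node}
        echo "node$node HugePages_Surp: $(get_meminfo HugePages_Surp "$node")"
    done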
continue 00:04:10.935 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.935 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.935 16:12:39 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.935 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.935 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.935 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.935 16:12:39 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.936 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.936 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.936 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.936 16:12:39 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.936 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.936 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.936 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.936 16:12:39 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.936 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.936 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.936 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.936 16:12:39 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.936 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.936 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.936 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.936 16:12:39 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.936 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.936 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.936 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.936 16:12:39 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.936 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.936 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.936 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.936 16:12:39 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.936 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.936 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.936 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.936 16:12:39 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.936 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.936 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.936 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.936 16:12:39 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.936 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.936 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.936 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.936 16:12:39 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.936 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.936 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.936 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.936 16:12:39 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:10.936 16:12:39 -- setup/common.sh@32 -- # continue 00:04:10.936 16:12:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.936 16:12:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.936 16:12:39 -- setup/common.sh@32 -- # [[ 
AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
[xtrace elided: setup/common.sh@32 compares each remaining /proc/meminfo field (AnonPages through HugePages_Free) against HugePages_Surp; none match, each loop iteration continues]
00:04:10.936 16:12:39 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:10.936 16:12:39 -- setup/common.sh@33 -- # echo 0
00:04:10.936 16:12:39 -- setup/common.sh@33 -- # return 0
00:04:10.936 16:12:39 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:10.936 16:12:39 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:10.936 16:12:39 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:10.936 16:12:39 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:10.936 16:12:39 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:04:10.936 node0=512 expecting 512
00:04:10.936 16:12:39 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:10.936 16:12:39 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:10.936 16:12:39 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:10.936 16:12:39 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
00:04:10.936 node1=512 expecting 512
00:04:10.936 16:12:39 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:04:10.936
00:04:10.936 real 0m3.540s
00:04:10.936 user 0m1.262s
00:04:10.936 sys 0m2.309s
00:04:10.936 16:12:39 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:10.936 16:12:39 -- common/autotest_common.sh@10 -- # set +x
00:04:10.936 ************************************
00:04:10.936 END TEST per_node_1G_alloc
00:04:10.936 ************************************
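The per-node figures just logged ("node0=512 expecting 512") come from the kernel's per-node hugepage counters. A minimal sketch of that mechanism (not part of the SPDK scripts; assumes root, two populated NUMA nodes, and the 2048 kB default hugepage size) requests 512 pages on each node through sysfs and reads back what the kernel actually granted:

#!/usr/bin/env bash
# Sketch: set and verify per-node 2048 kB hugepage counts via sysfs.
set -euo pipefail
pages=512
for node in 0 1; do
  f=/sys/devices/system/node/node${node}/hugepages/hugepages-2048kB/nr_hugepages
  echo "$pages" > "$f"                              # ask the kernel for 512 pages on this node
  echo "node${node}=$(<"$f") expecting ${pages}"    # kernel may grant fewer if memory is fragmented
done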
00:04:10.936 16:12:39 -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc
00:04:10.936 16:12:39 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:10.936 16:12:39 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:10.936 16:12:39 -- common/autotest_common.sh@10 -- # set +x
00:04:10.936 ************************************
00:04:10.936 START TEST even_2G_alloc
00:04:10.936 ************************************
00:04:10.936 16:12:39 -- common/autotest_common.sh@1104 -- # even_2G_alloc
00:04:10.936 16:12:39 -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152
00:04:10.936 16:12:39 -- setup/hugepages.sh@49 -- # local size=2097152
00:04:10.936 16:12:39 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:04:10.936 16:12:39 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:10.936 16:12:39 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:10.936 16:12:39 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:04:10.936 16:12:39 -- setup/hugepages.sh@62 -- # user_nodes=()
00:04:10.936 16:12:39 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:10.936 16:12:39 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:10.936 16:12:39 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:10.936 16:12:39 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:10.936 16:12:39 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:10.936 16:12:39 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:04:10.936 16:12:39 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:04:10.936 16:12:39 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:10.936 16:12:39 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:04:10.936 16:12:39 -- setup/hugepages.sh@83 -- # : 512
00:04:10.936 16:12:39 -- setup/hugepages.sh@84 -- # : 1
00:04:10.936 16:12:39 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:10.936 16:12:39 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:04:10.936 16:12:39 -- setup/hugepages.sh@83 -- # : 0
00:04:10.936 16:12:39 -- setup/hugepages.sh@84 -- # : 0
00:04:10.936 16:12:39 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:10.936 16:12:39 -- setup/hugepages.sh@153 -- # NRHUGE=1024
00:04:10.936 16:12:39 -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes
00:04:10.936 16:12:39 -- setup/hugepages.sh@153 -- # setup output
00:04:10.936 16:12:39 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:10.936 16:12:39 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:14.229 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:14.229 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:14.229 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:14.229 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:14.229 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:14.229 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:14.229 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:14.229 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:14.229 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:14.229 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:14.229 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:14.229 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:14.229 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:14.229 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:14.229 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:14.229 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:14.229 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
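In the trace above, get_test_nr_hugepages turned the 2097152 kB (2 GiB) request into nr_hugepages=1024 at the default 2048 kB page size, and get_test_nr_hugepages_per_node spread them evenly, filling nodes_test from the highest node index down. A sketch of that arithmetic, reconstructed from the trace (variable names follow the trace; the unit of size is assumed to be kB, and this is not the verbatim hugepages.sh source):

# Sketch of the even-allocation arithmetic traced above.
size_kb=2097152                                    # requested pool: 2 GiB expressed in kB
default_hugepages=2048                             # kB per page (Hugepagesize in /proc/meminfo)
nr_hugepages=$(( size_kb / default_hugepages ))    # -> 1024 pages
_no_nodes=2                                        # populated NUMA nodes on this rig
declare -a nodes_test
while (( _no_nodes > 0 )); do
  nodes_test[_no_nodes - 1]=$(( nr_hugepages / 2 ))   # 512 per node, highest index first
  (( _no_nodes-- ))
done
echo "${nodes_test[@]}"                            # -> 512 512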
00:04:14.229 16:12:42 -- setup/hugepages.sh@154 -- # verify_nr_hugepages
00:04:14.229 16:12:42 -- setup/hugepages.sh@89 -- # local node
00:04:14.229 16:12:42 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:14.229 16:12:42 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:14.229 16:12:42 -- setup/hugepages.sh@92 -- # local surp
00:04:14.229 16:12:42 -- setup/hugepages.sh@93 -- # local resv
00:04:14.229 16:12:42 -- setup/hugepages.sh@94 -- # local anon
00:04:14.229 16:12:42 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:14.229 16:12:42 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:14.229 16:12:42 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:14.229 16:12:42 -- setup/common.sh@18 -- # local node=
00:04:14.229 16:12:42 -- setup/common.sh@19 -- # local var val
00:04:14.229 16:12:42 -- setup/common.sh@20 -- # local mem_f mem
00:04:14.229 16:12:42 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:14.229 16:12:42 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:14.229 16:12:42 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:14.229 16:12:42 -- setup/common.sh@28 -- # mapfile -t mem
00:04:14.229 16:12:42 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:14.229 16:12:42 -- setup/common.sh@31 -- # IFS=': '
00:04:14.229 16:12:42 -- setup/common.sh@31 -- # read -r var val _
00:04:14.229 16:12:42 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42501572 kB' 'MemAvailable: 44356864 kB' 'Buffers: 4300 kB' 'Cached: 11309924 kB' 'SwapCached: 28 kB' 'Active: 10455032 kB' 'Inactive: 1486324 kB' 'Active(anon): 9971776 kB' 'Inactive(anon): 32212 kB' 'Active(file): 483256 kB' 'Inactive(file): 1454112 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 630412 kB' 'Mapped: 176184 kB' 'Shmem: 9376856 kB' 'KReclaimable: 565620 kB' 'Slab: 1211280 kB' 'SReclaimable: 565620 kB' 'SUnreclaim: 645660 kB' 'KernelStack: 21936 kB' 'PageTables: 8548 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 11396772 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216484 kB' 'VmallocChunk: 0 kB' 'Percpu: 104832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3177844 kB' 'DirectMap2M: 52082688 kB' 'DirectMap1G: 13631488 kB'
[xtrace elided: setup/common.sh@32 compares each /proc/meminfo field (MemTotal through HardwareCorrupted) against AnonHugePages; none match, each loop iteration continues]
00:04:14.230 16:12:42 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:14.230 16:12:42 -- setup/common.sh@33 -- # echo 0
00:04:14.230 16:12:42 -- setup/common.sh@33 -- # return 0
00:04:14.230 16:12:42 -- setup/hugepages.sh@97 -- # anon=0
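The lookup just completed shows the whole get_meminfo technique: slurp /proc/meminfo (or a per-node meminfo file), strip any "Node N " prefix, then split each "Field: value kB" line on ': ' until the requested field matches. A condensed, runnable rendering of that loop (the real setup/common.sh differs in details; get_meminfo_sketch is a hypothetical name):

shopt -s extglob   # required for the +([0-9]) pattern below
# Sketch of the get_meminfo loop traced above: print the value of one
# /proc/meminfo field, optionally for a single NUMA node.
get_meminfo_sketch() {
  local get=$1 node=${2:-} var val _
  local mem_f=/proc/meminfo
  [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
    mem_f=/sys/devices/system/node/node$node/meminfo
  local -a mem
  mapfile -t mem < "$mem_f"
  mem=("${mem[@]#Node +([0-9]) }")   # per-node files prefix every line with "Node N "
  local line
  for line in "${mem[@]}"; do
    IFS=': ' read -r var val _ <<< "$line"
    [[ $var == "$get" ]] && { echo "$val"; return 0; }
  done
  return 1
}
get_meminfo_sketch HugePages_Total   # -> 1024 on this box, per the dump above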
00:04:14.230 16:12:42 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:14.230 16:12:42 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:14.230 16:12:42 -- setup/common.sh@18 -- # local node=
00:04:14.230 16:12:42 -- setup/common.sh@19 -- # local var val
00:04:14.230 16:12:42 -- setup/common.sh@20 -- # local mem_f mem
00:04:14.230 16:12:42 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:14.230 16:12:42 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:14.230 16:12:42 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:14.230 16:12:42 -- setup/common.sh@28 -- # mapfile -t mem
00:04:14.230 16:12:42 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:14.230 16:12:42 -- setup/common.sh@31 -- # IFS=': '
00:04:14.230 16:12:42 -- setup/common.sh@31 -- # read -r var val _
00:04:14.230 16:12:42 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42503360 kB' 'MemAvailable: 44358652 kB' 'Buffers: 4300 kB' 'Cached: 11309928 kB' 'SwapCached: 28 kB' 'Active: 10454692 kB' 'Inactive: 1486324 kB' 'Active(anon): 9971436 kB' 'Inactive(anon): 32212 kB' 'Active(file): 483256 kB' 'Inactive(file): 1454112 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 630116 kB' 'Mapped: 176104 kB' 'Shmem: 9376860 kB' 'KReclaimable: 565620 kB' 'Slab: 1211240 kB' 'SReclaimable: 565620 kB' 'SUnreclaim: 645620 kB' 'KernelStack: 21888 kB' 'PageTables: 8380 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 11396540 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216436 kB' 'VmallocChunk: 0 kB' 'Percpu: 104832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3177844 kB' 'DirectMap2M: 52082688 kB' 'DirectMap1G: 13631488 kB'
[xtrace elided: setup/common.sh@32 compares each /proc/meminfo field (MemTotal through HugePages_Rsvd) against HugePages_Surp; none match, each loop iteration continues]
00:04:14.232 16:12:42 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:14.232 16:12:42 -- setup/common.sh@33 -- # echo 0
00:04:14.232 16:12:42 -- setup/common.sh@33 -- # return 0
00:04:14.232 16:12:42 -- setup/hugepages.sh@99 -- # surp=0
00:04:14.232 16:12:42 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:14.232 16:12:42 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:14.232 16:12:42 -- setup/common.sh@18 -- # local node=
00:04:14.232 16:12:42 -- setup/common.sh@19 -- # local var val
00:04:14.232 16:12:42 -- setup/common.sh@20 -- # local mem_f mem
00:04:14.232 16:12:42 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:14.232 16:12:42 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:14.232 16:12:42 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:14.232 16:12:42 -- setup/common.sh@28 -- # mapfile -t mem
00:04:14.232 16:12:42 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:14.232 16:12:42 -- setup/common.sh@31 -- # IFS=': '
00:04:14.232 16:12:42 -- setup/common.sh@31 -- # read -r var val _
00:04:14.232 16:12:42 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42503232 kB' 'MemAvailable: 44358524 kB' 'Buffers: 4300 kB' 'Cached: 11309928 kB' 'SwapCached: 28 kB' 'Active: 10454684 kB' 'Inactive: 1486324 kB' 'Active(anon): 9971428 kB' 'Inactive(anon): 32212 kB' 'Active(file): 483256 kB' 'Inactive(file): 1454112 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 630092 kB' 'Mapped: 176104 kB' 'Shmem: 9376860 kB' 'KReclaimable: 565620 kB' 'Slab: 1211300 kB' 'SReclaimable: 565620 kB' 'SUnreclaim: 645680 kB' 'KernelStack: 21904 kB' 'PageTables: 8456 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 11396552 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216436 kB' 'VmallocChunk: 0 kB' 'Percpu: 104832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3177844 kB' 'DirectMap2M: 52082688 kB' 'DirectMap1G: 13631488 kB'
[xtrace elided: setup/common.sh@32 compares each /proc/meminfo field (MemTotal through HugePages_Free) against HugePages_Rsvd; none match, each loop iteration continues]
00:04:14.233 16:12:42 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:14.233 16:12:42 -- setup/common.sh@33 -- # echo 0
00:04:14.233 16:12:42 -- setup/common.sh@33 -- # return 0
00:04:14.233 16:12:42 -- setup/hugepages.sh@100 -- # resv=0
00:04:14.233 16:12:42 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:14.233 nr_hugepages=1024
00:04:14.233 16:12:42 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:14.233 resv_hugepages=0
00:04:14.233 16:12:42 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:14.233 surplus_hugepages=0
00:04:14.233 16:12:42 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:14.233 anon_hugepages=0
00:04:14.233 16:12:42 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:14.233 16:12:42 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
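The two arithmetic checks just traced are the heart of verify_nr_hugepages: the kernel-reported total must equal the configured count once surplus and reserved pages are netted out, and here anon=0, surp=0, resv=0 with HugePages_Total=1024, so both pass. A sketch of that check, using the hypothetical get_meminfo_sketch helper shown earlier (values as logged on this run):

# Sketch of the consistency checks at hugepages.sh@107/@109.
nr_hugepages=1024
surp=$(get_meminfo_sketch HugePages_Surp)     # 0
resv=$(get_meminfo_sketch HugePages_Rsvd)     # 0
total=$(get_meminfo_sketch HugePages_Total)   # 1024
(( total == nr_hugepages + surp + resv )) || echo "hugepage accounting mismatch"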
00:04:14.233 16:12:42 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:14.233 16:12:42 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:14.233 16:12:42 -- setup/common.sh@18 -- # local node=
00:04:14.233 16:12:42 -- setup/common.sh@19 -- # local var val
00:04:14.233 16:12:42 -- setup/common.sh@20 -- # local mem_f mem
00:04:14.233 16:12:42 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:14.233 16:12:42 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:14.233 16:12:42 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:14.233 16:12:42 -- setup/common.sh@28 -- # mapfile -t mem
00:04:14.233 16:12:42 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:14.233 16:12:42 -- setup/common.sh@31 -- # IFS=': '
00:04:14.233 16:12:42 -- setup/common.sh@31 -- # read -r var val _
00:04:14.233 16:12:42 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42503628 kB' 'MemAvailable: 44358920 kB' 'Buffers: 4300 kB' 'Cached: 11309956 kB' 'SwapCached: 28 kB' 'Active: 10454704 kB' 'Inactive: 1486324 kB' 'Active(anon): 9971448 kB' 'Inactive(anon): 32212 kB' 'Active(file): 483256 kB' 'Inactive(file): 1454112 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 630092 kB' 'Mapped: 176104 kB' 'Shmem: 9376888 kB' 'KReclaimable: 565620 kB' 'Slab: 1211300 kB' 'SReclaimable: 565620 kB' 'SUnreclaim: 645680 kB' 'KernelStack: 21904 kB' 'PageTables: 8456 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 11396568 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216436 kB' 'VmallocChunk: 0 kB' 'Percpu: 104832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3177844 kB' 'DirectMap2M: 52082688 kB' 'DirectMap1G: 13631488 kB'
[xtrace continues: setup/common.sh@32 compares each /proc/meminfo field (MemTotal onward, through Shmem so far) against HugePages_Total; non-matching fields continue]
16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.234 16:12:42 -- 
setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.234 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.234 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # continue 
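(Note on the backslash-heavy patterns throughout this trace: they appear to be an artifact of bash xtrace, which re-quotes the unquoted pattern side of a [[ ... == ... ]] test character by character. A minimal reproduction, outside this log:

    set -x
    var=HugePages_Rsvd
    [[ $var == HugePages_Rsvd ]]
    # xtrace prints: [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]

So each escaped run such as \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l is just the literal field name being compared against.)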
00:04:14.235 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.235 16:12:42 -- setup/common.sh@33 -- # echo 1024 00:04:14.235 16:12:42 -- setup/common.sh@33 -- # return 0 00:04:14.235 16:12:42 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:14.235 16:12:42 -- setup/hugepages.sh@112 -- # get_nodes 00:04:14.235 16:12:42 -- setup/hugepages.sh@27 -- # local node 00:04:14.235 16:12:42 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:14.235 16:12:42 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:14.235 16:12:42 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:14.235 16:12:42 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:14.235 16:12:42 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:14.235 16:12:42 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:14.235 16:12:42 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:14.235 16:12:42 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:14.235 16:12:42 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:14.235 16:12:42 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:14.235 16:12:42 -- setup/common.sh@18 -- # local node=0 00:04:14.235 16:12:42 -- setup/common.sh@19 -- # local var val 00:04:14.235 16:12:42 -- setup/common.sh@20 -- # local mem_f mem 00:04:14.235 16:12:42 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:14.235 16:12:42 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:14.235 16:12:42 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:14.235 16:12:42 -- setup/common.sh@28 -- # mapfile -t mem 00:04:14.235 16:12:42 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.235 16:12:42 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 24859804 kB' 'MemUsed: 7732280 kB' 'SwapCached: 16 kB' 'Active: 4522748 kB' 'Inactive: 338724 kB' 'Active(anon): 4304528 kB' 'Inactive(anon): 16 kB' 'Active(file): 218220 kB' 'Inactive(file): 338708 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4477584 kB' 'Mapped: 109924 kB' 'AnonPages: 387088 kB' 'Shmem: 3920640 kB' 'KernelStack: 12808 kB' 'PageTables: 5000 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 349476 kB' 'Slab: 641084 kB' 'SReclaimable: 349476 kB' 'SUnreclaim: 291608 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 
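(The entries above show setup/common.sh's get_meminfo helper switching from /proc/meminfo to the per-node /sys/devices/system/node/node0/meminfo file, stripping the "Node N " prefix, and scanning for the requested field. A minimal sketch reconstructed from the traced commands; the real helper in setup/common.sh may differ in details:

    shopt -s extglob                      # needed for the +([0-9]) pattern below

    get_meminfo() {
        local get=$1 node=$2
        local mem_f=/proc/meminfo
        # Per-node files exist under /sys; fall back to the global file otherwise.
        [[ -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        local -a mem
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")  # drop the "Node 0 " prefix on per-node lines
        local IFS=': ' var val _ line
        for line in "${mem[@]}"; do
            read -r var val _ <<< "$line"
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done
    }
    # e.g. get_meminfo HugePages_Surp 0   ->  0, as echoed in the trace above

Every "continue" entry in the log is one non-matching iteration of that scan.)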
00:04:14.235 16:12:42 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.235 16:12:42 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.235 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.235 16:12:42 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.236 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.236 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.236 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.236 16:12:42 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.236 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.236 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.236 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.236 16:12:42 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.236 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.236 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.236 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.236 16:12:42 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.236 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.236 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.236 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.236 16:12:42 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.236 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.236 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.236 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.236 16:12:42 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.236 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.236 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.236 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.236 16:12:42 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.236 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.236 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.236 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.236 16:12:42 -- setup/common.sh@32 -- # [[ SUnreclaim == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.236 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.236 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.236 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.236 16:12:42 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.236 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.236 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.236 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.236 16:12:42 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.236 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.236 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.236 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.236 16:12:42 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.236 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.236 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.236 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.236 16:12:42 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.236 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.236 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.236 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.236 16:12:42 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.236 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.236 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.236 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.236 16:12:42 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.236 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.236 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.236 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.237 16:12:42 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.237 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.237 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.237 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.237 16:12:42 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.237 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.237 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.237 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.237 16:12:42 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.237 16:12:42 -- setup/common.sh@33 -- # echo 0 00:04:14.237 16:12:42 -- setup/common.sh@33 -- # return 0 00:04:14.237 16:12:42 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:14.237 16:12:42 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:14.237 16:12:42 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:14.237 16:12:42 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:14.237 16:12:42 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:14.237 16:12:42 -- setup/common.sh@18 -- # local node=1 00:04:14.237 16:12:42 -- setup/common.sh@19 -- # local var val 00:04:14.237 16:12:42 -- setup/common.sh@20 -- # local mem_f mem 00:04:14.237 16:12:42 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:14.237 16:12:42 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:14.237 16:12:42 -- setup/common.sh@24 -- # 
mem_f=/sys/devices/system/node/node1/meminfo 00:04:14.237 16:12:42 -- setup/common.sh@28 -- # mapfile -t mem 00:04:14.237 16:12:42 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:14.237 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.237 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.237 16:12:42 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27703148 kB' 'MemFree: 17644504 kB' 'MemUsed: 10058644 kB' 'SwapCached: 12 kB' 'Active: 5931940 kB' 'Inactive: 1147600 kB' 'Active(anon): 5666904 kB' 'Inactive(anon): 32196 kB' 'Active(file): 265036 kB' 'Inactive(file): 1115404 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 6836716 kB' 'Mapped: 66180 kB' 'AnonPages: 242972 kB' 'Shmem: 5456264 kB' 'KernelStack: 9080 kB' 'PageTables: 3412 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 216144 kB' 'Slab: 570216 kB' 'SReclaimable: 216144 kB' 'SUnreclaim: 354072 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:14.237 16:12:42 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.237 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.237 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.237 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.237 16:12:42 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.237 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.237 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.237 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.237 16:12:42 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.237 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.237 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.237 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.237 16:12:42 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.237 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.237 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.237 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.237 16:12:42 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.238 16:12:42 -- 
setup/common.sh@32 -- # continue 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.238 16:12:42 -- 
setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.238 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.238 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.239 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.239 16:12:42 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.239 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.239 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.239 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.239 16:12:42 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.239 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.239 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.239 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.239 16:12:42 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.239 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.239 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.239 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.239 16:12:42 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.239 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.239 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.239 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.239 16:12:42 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.239 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.239 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.239 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.239 16:12:42 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.239 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.239 16:12:42 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:14.239 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.239 16:12:42 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.239 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.239 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.239 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.239 16:12:42 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.239 16:12:42 -- setup/common.sh@32 -- # continue 00:04:14.239 16:12:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.239 16:12:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.239 16:12:42 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.239 16:12:42 -- setup/common.sh@33 -- # echo 0 00:04:14.239 16:12:42 -- setup/common.sh@33 -- # return 0 00:04:14.239 16:12:42 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:14.239 16:12:42 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:14.239 16:12:42 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:14.239 16:12:42 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:14.239 16:12:42 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:14.239 node0=512 expecting 512 00:04:14.239 16:12:42 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:14.239 16:12:42 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:14.239 16:12:42 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:14.239 16:12:42 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:14.239 node1=512 expecting 512 00:04:14.239 16:12:42 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:14.239 00:04:14.239 real 0m3.340s 00:04:14.239 user 0m1.190s 00:04:14.239 sys 0m2.179s 00:04:14.239 16:12:42 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:14.239 16:12:42 -- common/autotest_common.sh@10 -- # set +x 00:04:14.239 ************************************ 00:04:14.239 END TEST even_2G_alloc 00:04:14.239 ************************************ 00:04:14.239 16:12:42 -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:04:14.239 16:12:42 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:14.239 16:12:42 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:14.239 16:12:42 -- common/autotest_common.sh@10 -- # set +x 00:04:14.239 ************************************ 00:04:14.239 START TEST odd_alloc 00:04:14.239 ************************************ 00:04:14.239 16:12:42 -- common/autotest_common.sh@1104 -- # odd_alloc 00:04:14.239 16:12:42 -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:04:14.239 16:12:42 -- setup/hugepages.sh@49 -- # local size=2098176 00:04:14.239 16:12:42 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:14.239 16:12:42 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:14.239 16:12:42 -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:04:14.239 16:12:42 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:14.239 16:12:42 -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:14.239 16:12:42 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:14.239 16:12:42 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:04:14.239 16:12:42 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:14.239 16:12:42 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:14.239 16:12:42 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:14.239 16:12:42 
-- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:14.239 16:12:42 -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:14.239 16:12:42 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:14.239 16:12:42 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:14.239 16:12:42 -- setup/hugepages.sh@83 -- # : 513 00:04:14.239 16:12:42 -- setup/hugepages.sh@84 -- # : 1 00:04:14.239 16:12:42 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:14.239 16:12:42 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:04:14.239 16:12:42 -- setup/hugepages.sh@83 -- # : 0 00:04:14.239 16:12:42 -- setup/hugepages.sh@84 -- # : 0 00:04:14.239 16:12:42 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:14.239 16:12:42 -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:04:14.239 16:12:42 -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:04:14.239 16:12:42 -- setup/hugepages.sh@160 -- # setup output 00:04:14.239 16:12:42 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:14.239 16:12:42 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:17.526 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:17.526 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:17.526 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:17.526 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:17.526 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:17.526 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:17.526 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:17.526 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:17.526 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:17.526 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:17.526 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:17.526 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:17.526 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:17.526 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:17.526 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:17.526 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:17.526 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:17.526 16:12:46 -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:04:17.526 16:12:46 -- setup/hugepages.sh@89 -- # local node 00:04:17.526 16:12:46 -- setup/hugepages.sh@90 -- # local sorted_t 00:04:17.526 16:12:46 -- setup/hugepages.sh@91 -- # local sorted_s 00:04:17.526 16:12:46 -- setup/hugepages.sh@92 -- # local surp 00:04:17.526 16:12:46 -- setup/hugepages.sh@93 -- # local resv 00:04:17.526 16:12:46 -- setup/hugepages.sh@94 -- # local anon 00:04:17.526 16:12:46 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:17.526 16:12:46 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:17.526 16:12:46 -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:17.526 16:12:46 -- setup/common.sh@18 -- # local node= 00:04:17.526 16:12:46 -- setup/common.sh@19 -- # local var val 00:04:17.526 16:12:46 -- setup/common.sh@20 -- # local mem_f mem 00:04:17.526 16:12:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:17.526 16:12:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:17.526 16:12:46 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:17.526 16:12:46 -- setup/common.sh@28 -- # mapfile -t mem 00:04:17.526 
16:12:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:17.526 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.526 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.526 16:12:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42533372 kB' 'MemAvailable: 44388664 kB' 'Buffers: 4300 kB' 'Cached: 11310052 kB' 'SwapCached: 28 kB' 'Active: 10455752 kB' 'Inactive: 1486324 kB' 'Active(anon): 9972496 kB' 'Inactive(anon): 32212 kB' 'Active(file): 483256 kB' 'Inactive(file): 1454112 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 631008 kB' 'Mapped: 176200 kB' 'Shmem: 9376984 kB' 'KReclaimable: 565620 kB' 'Slab: 1211228 kB' 'SReclaimable: 565620 kB' 'SUnreclaim: 645608 kB' 'KernelStack: 21904 kB' 'PageTables: 8452 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486620 kB' 'Committed_AS: 11397172 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216404 kB' 'VmallocChunk: 0 kB' 'Percpu: 104832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3177844 kB' 'DirectMap2M: 52082688 kB' 'DirectMap1G: 13631488 kB' 00:04:17.526 16:12:46 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.526 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.526 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.526 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.526 16:12:46 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.526 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.526 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.526 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.526 16:12:46 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.526 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.526 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.526 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.526 16:12:46 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.526 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.526 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.526 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.526 16:12:46 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.526 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.526 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.526 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.526 16:12:46 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.526 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.526 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.526 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.526 16:12:46 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.526 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.526 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.526 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.526 16:12:46 -- setup/common.sh@32 -- # [[ Inactive == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.526 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.526 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.526 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.526 16:12:46 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.526 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.526 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.526 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.526 16:12:46 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.526 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.526 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.526 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.526 16:12:46 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.526 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 
00:04:17.527 16:12:46 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 
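(For odd_alloc, the trace requests HUGEMEM=2049, i.e. 2098176 kB, which at the 2048 kB default page size comes out to 1025 pages, apparently rounded up from 1024.5. The nodes_test assignments earlier in the trace split them unevenly: 513 pages on node 0, 512 on node 1. An illustrative sketch reproducing that split; variable names follow the trace but the arithmetic is a guess at the intent:

    nr_hugepages=1025
    no_nodes=2
    declare -a nodes_test
    for ((i = 0; i < no_nodes; i++)); do
        nodes_test[i]=$(( nr_hugepages / no_nodes ))   # 512 each
    done
    (( nodes_test[0] += nr_hugepages % no_nodes ))     # node 0 absorbs the odd page: 513
    echo "node0=${nodes_test[0]} node1=${nodes_test[1]}")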
00:04:17.527 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.527 16:12:46 -- setup/common.sh@33 -- # echo 0 00:04:17.527 16:12:46 -- setup/common.sh@33 -- # return 0 00:04:17.527 16:12:46 -- setup/hugepages.sh@97 -- # anon=0 00:04:17.527 16:12:46 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:17.527 16:12:46 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:17.527 16:12:46 -- setup/common.sh@18 -- # local node= 00:04:17.527 16:12:46 -- setup/common.sh@19 -- # local var val 00:04:17.527 16:12:46 -- setup/common.sh@20 -- # local mem_f mem 00:04:17.527 16:12:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:17.527 16:12:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:17.527 16:12:46 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:17.527 16:12:46 -- setup/common.sh@28 -- # mapfile -t mem 00:04:17.527 16:12:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.527 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.527 16:12:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42536276 kB' 'MemAvailable: 44391568 kB' 'Buffers: 4300 kB' 'Cached: 11310052 kB' 'SwapCached: 28 kB' 'Active: 10456088 kB' 'Inactive: 1486324 kB' 'Active(anon): 9972832 kB' 'Inactive(anon): 32212 kB' 'Active(file): 483256 kB' 'Inactive(file): 1454112 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 
kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 631464 kB' 'Mapped: 176164 kB' 'Shmem: 9376984 kB' 'KReclaimable: 565620 kB' 'Slab: 1211228 kB' 'SReclaimable: 565620 kB' 'SUnreclaim: 645608 kB' 'KernelStack: 21920 kB' 'PageTables: 8512 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486620 kB' 'Committed_AS: 11400256 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216324 kB' 'VmallocChunk: 0 kB' 'Percpu: 104832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3177844 kB' 'DirectMap2M: 52082688 kB' 'DirectMap1G: 13631488 kB' 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.527 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.789 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.789 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.789 16:12:46 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.789 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.789 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.789 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.789 16:12:46 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.789 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.789 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.789 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.789 16:12:46 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.789 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.789 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.789 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.789 16:12:46 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.789 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.789 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.789 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.789 16:12:46 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.789 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.789 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.789 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.789 16:12:46 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.789 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.789 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.789 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.789 16:12:46 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.789 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.789 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.789 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.789 16:12:46 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.789 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.789 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.789 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.789 16:12:46 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
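(The verification pattern repeated through both tests boils down to: read the global HugePages counters, assert that HugePages_Total equals the requested count plus surplus and reserved pages, then check each node's share, which is what produces the "node0=512 expecting 512" lines. A condensed, hedged sketch of that flow using the get_meminfo reconstruction above; the real hugepages.sh does more per-node bookkeeping:

    verify_nr_hugepages() {
        local nr=$1                        # e.g. 1024 for even_2G_alloc, 1025 for odd_alloc
        local surp resv node
        surp=$(get_meminfo HugePages_Surp)
        resv=$(get_meminfo HugePages_Rsvd)
        # Global consistency check, as in hugepages.sh@107-110.
        (( $(get_meminfo HugePages_Total) == nr + surp + resv )) || return 1
        for node in 0 1; do
            echo "node$node=$(get_meminfo HugePages_Total "$node") expecting $(( nr / 2 ))"
        done
    })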
00:04:17.789 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.789 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.789 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.789 16:12:46 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.789 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.789 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.789 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.789 16:12:46 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.789 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 
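The run above is setup/common.sh's get_meminfo at work: it snapshots a meminfo file into an array with mapfile, strips any "Node N " prefix, then walks the lines with IFS=': ' read and a literal [[ $var == $get ]] test, echoing the value on a match (0 for AnonHugePages earlier, and the HugePages_Surp lookup now in flight). A minimal reconstruction of that loop, assuming plain bash with extglob; the function name and the trailing example calls are illustrative, not the actual SPDK source:

    shopt -s extglob   # needed for the +([0-9]) pattern below

    get_meminfo_sketch() {
        local get=$1 node=${2:-}     # key to fetch, optional NUMA node
        local mem_f=/proc/meminfo
        # Per-node counters live under /sys; with $node empty the test
        # fails (".../node/node/meminfo" does not exist, as in the trace)
        # and the global file is used instead.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        local -a mem
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")   # per-node lines start "Node 0 ..."
        local line var val _
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"   # key, number, unit
            if [[ $var == "$get" ]]; then
                echo "$val"
                return 0
            fi
        done
        return 1
    }

    get_meminfo_sketch AnonHugePages       # 0 on this box, per the trace
    get_meminfo_sketch HugePages_Free 1    # node 1 reports 513 further down

Setting IFS to ': ' makes both the colon and the space field separators, so "HugePages_Surp: 0" and "MemTotal: 60295232 kB" parse with the same read.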
16:12:46 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ Unaccepted 
== \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.790 16:12:46 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.790 16:12:46 -- setup/common.sh@33 -- # echo 0 00:04:17.790 16:12:46 -- setup/common.sh@33 -- # return 0 00:04:17.790 16:12:46 -- setup/hugepages.sh@99 -- # surp=0 00:04:17.790 16:12:46 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:17.790 16:12:46 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:17.790 16:12:46 -- setup/common.sh@18 -- # local node= 00:04:17.790 16:12:46 -- setup/common.sh@19 -- # local var val 00:04:17.790 16:12:46 -- setup/common.sh@20 -- # local mem_f mem 00:04:17.790 16:12:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:17.790 16:12:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:17.790 16:12:46 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:17.790 16:12:46 -- setup/common.sh@28 -- # mapfile -t mem 00:04:17.790 16:12:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.790 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42535764 kB' 'MemAvailable: 44391056 kB' 'Buffers: 4300 kB' 'Cached: 11310064 kB' 'SwapCached: 28 kB' 'Active: 10455984 kB' 'Inactive: 1486324 kB' 'Active(anon): 9972728 kB' 'Inactive(anon): 32212 kB' 'Active(file): 483256 kB' 'Inactive(file): 1454112 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 631308 kB' 'Mapped: 176116 kB' 'Shmem: 9376996 kB' 'KReclaimable: 565620 kB' 'Slab: 1211268 kB' 'SReclaimable: 565620 kB' 'SUnreclaim: 645648 kB' 'KernelStack: 21984 kB' 'PageTables: 8312 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486620 kB' 'Committed_AS: 11400248 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216340 kB' 'VmallocChunk: 0 kB' 'Percpu: 104832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3177844 kB' 'DirectMap2M: 52082688 kB' 'DirectMap1G: 13631488 kB' 00:04:17.791 16:12:46 -- 
setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # 
IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- 
setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 
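One note on the pattern spelling: the escaped right-hand sides in these tests (\H\u\g\e\P\a\g\e\s\_\R\s\v\d and friends) are not literal backslashes in the script; xtrace prints a quoted [[ ]] pattern with every character escaped to show it matches literally rather than as a glob. A two-line illustration (hypothetical snippet, not taken from this job):

    key=HugePages_Rsvd
    ( set -x; [[ $key == "HugePages_Rsvd" ]] )
    # xtrace output: + [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]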
16:12:46 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.791 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.791 16:12:46 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.792 16:12:46 -- setup/common.sh@33 -- # echo 0 00:04:17.792 
16:12:46 -- setup/common.sh@33 -- # return 0 00:04:17.792 16:12:46 -- setup/hugepages.sh@100 -- # resv=0 00:04:17.792 16:12:46 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 00:04:17.792 nr_hugepages=1025 00:04:17.792 16:12:46 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:17.792 resv_hugepages=0 00:04:17.792 16:12:46 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:17.792 surplus_hugepages=0 00:04:17.792 16:12:46 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:17.792 anon_hugepages=0 00:04:17.792 16:12:46 -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:17.792 16:12:46 -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:04:17.792 16:12:46 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:17.792 16:12:46 -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:17.792 16:12:46 -- setup/common.sh@18 -- # local node= 00:04:17.792 16:12:46 -- setup/common.sh@19 -- # local var val 00:04:17.792 16:12:46 -- setup/common.sh@20 -- # local mem_f mem 00:04:17.792 16:12:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:17.792 16:12:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:17.792 16:12:46 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:17.792 16:12:46 -- setup/common.sh@28 -- # mapfile -t mem 00:04:17.792 16:12:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.792 16:12:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42534756 kB' 'MemAvailable: 44390048 kB' 'Buffers: 4300 kB' 'Cached: 11310064 kB' 'SwapCached: 28 kB' 'Active: 10455596 kB' 'Inactive: 1486324 kB' 'Active(anon): 9972340 kB' 'Inactive(anon): 32212 kB' 'Active(file): 483256 kB' 'Inactive(file): 1454112 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 630972 kB' 'Mapped: 176116 kB' 'Shmem: 9376996 kB' 'KReclaimable: 565620 kB' 'Slab: 1211268 kB' 'SReclaimable: 565620 kB' 'SUnreclaim: 645648 kB' 'KernelStack: 21888 kB' 'PageTables: 8544 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486620 kB' 'Committed_AS: 11401620 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216356 kB' 'VmallocChunk: 0 kB' 'Percpu: 104832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3177844 kB' 'DirectMap2M: 52082688 kB' 'DirectMap1G: 13631488 kB' 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
00:04:17.792 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.792 16:12:46 -- setup/common.sh@31 -- 
# read -r var val _ 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.792 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.792 16:12:46 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # continue 
00:04:17.793 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.793 
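At this point the script has already derived anon=0, surp=0 and resv=0 and echoed nr_hugepages=1025; the loop being traced here re-reads HugePages_Total so hugepages.sh@107-110 can confirm the kernel's view matches, after which get_nodes splits the count per NUMA node (512 on node0, 513 on node1 further down). The arithmetic being checked, as a sketch using the names from the trace:

    nr_hugepages=1025 surp=0 resv=0 anon=0
    total=1025                                  # value echoed by get_meminfo just below
    (( total == nr_hugepages + surp + resv )) || echo "hugepage accounting mismatch"
    # per-node split reported later in the trace
    nodes_test=([0]=512 [1]=513)
    (( nodes_test[0] + nodes_test[1] == total )) && echo "nodes sum to total"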
16:12:46 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.793 16:12:46 -- setup/common.sh@33 -- # echo 1025 00:04:17.793 16:12:46 -- setup/common.sh@33 -- # return 0 00:04:17.793 16:12:46 -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:17.793 16:12:46 -- setup/hugepages.sh@112 -- # get_nodes 00:04:17.793 16:12:46 -- setup/hugepages.sh@27 -- # local node 00:04:17.793 16:12:46 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:17.793 16:12:46 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:17.793 16:12:46 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:17.793 16:12:46 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:04:17.793 16:12:46 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:17.793 16:12:46 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:17.793 16:12:46 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:17.793 16:12:46 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:17.793 16:12:46 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:17.793 16:12:46 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:17.793 16:12:46 
-- setup/common.sh@18 -- # local node=0 00:04:17.793 16:12:46 -- setup/common.sh@19 -- # local var val 00:04:17.793 16:12:46 -- setup/common.sh@20 -- # local mem_f mem 00:04:17.793 16:12:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:17.793 16:12:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:17.793 16:12:46 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:17.793 16:12:46 -- setup/common.sh@28 -- # mapfile -t mem 00:04:17.793 16:12:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:17.793 16:12:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 24891336 kB' 'MemUsed: 7700748 kB' 'SwapCached: 16 kB' 'Active: 4523868 kB' 'Inactive: 338724 kB' 'Active(anon): 4305648 kB' 'Inactive(anon): 16 kB' 'Active(file): 218220 kB' 'Inactive(file): 338708 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4477596 kB' 'Mapped: 109936 kB' 'AnonPages: 388188 kB' 'Shmem: 3920652 kB' 'KernelStack: 12952 kB' 'PageTables: 5360 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 349476 kB' 'Slab: 641216 kB' 'SReclaimable: 349476 kB' 'SUnreclaim: 291740 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.793 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.793 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.793 16:12:46 -- 
setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.794 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.794 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.794 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.794 16:12:46 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.794 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.794 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.794 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.794 16:12:46 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.794 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.794 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.794 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.794 16:12:46 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.794 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.794 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.794 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.794 16:12:46 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.794 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.794 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.794 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.794 16:12:46 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.794 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.794 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.794 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.794 16:12:46 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.794 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.794 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.794 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.794 16:12:46 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.794 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.794 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.794 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.794 16:12:46 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.794 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.794 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.794 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.794 16:12:46 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.794 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.794 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.794 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.794 16:12:46 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.794 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.794 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.794 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.794 16:12:46 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.794 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.794 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.794 16:12:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.794 16:12:46 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.794 16:12:46 -- setup/common.sh@32 -- # continue 00:04:17.794 16:12:46 -- setup/common.sh@31 -- # IFS=': ' 
00:04:17.794 16:12:46 -- setup/common.sh@31 -- # read -r var val _
00:04:17.794 16:12:46 -- setup/common.sh@32 -- # continue [the @31 read / @32 compare pair repeats for each remaining node0 field, SecPageTables through HugePages_Free; none match HugePages_Surp]
00:04:17.795 16:12:46 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:17.795 16:12:46 -- setup/common.sh@33 -- # echo 0
00:04:17.795 16:12:46 -- setup/common.sh@33 -- # return 0
00:04:17.795 16:12:46 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:17.795 16:12:46 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:17.795 16:12:46 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:17.795 16:12:46 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:17.795 16:12:46 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:17.795 16:12:46 -- setup/common.sh@18 -- # local node=1
00:04:17.795 16:12:46 -- setup/common.sh@19 -- # local var val
00:04:17.795 16:12:46 -- setup/common.sh@20 -- # local mem_f mem
00:04:17.795 16:12:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:17.795 16:12:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:17.795 16:12:46 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:17.795 16:12:46 -- setup/common.sh@28 -- # mapfile -t mem
00:04:17.795 16:12:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:17.795 16:12:46 -- setup/common.sh@31 -- # IFS=': '
00:04:17.795 16:12:46 -- setup/common.sh@31 -- # read -r var val _
00:04:17.795 16:12:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27703148 kB' 'MemFree: 17644720 kB' 'MemUsed: 10058428 kB' 'SwapCached: 12 kB' 'Active: 5932468 kB' 'Inactive: 1147600 kB' 'Active(anon): 5667432 kB' 'Inactive(anon): 32196 kB' 'Active(file): 265036 kB' 'Inactive(file): 1115404 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 6836844 kB' 'Mapped: 66180 kB' 'AnonPages: 243376 kB' 'Shmem: 5456392 kB' 'KernelStack: 9064 kB' 'PageTables: 3408 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 216144 kB' 'Slab: 570052 kB' 'SReclaimable: 216144 kB' 'SUnreclaim: 353908 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0'
00:04:17.795 16:12:46 -- setup/common.sh@32 -- # continue [the @31 read / @32 compare pair repeats for each node1 field, MemTotal through HugePages_Free; none match HugePages_Surp]
00:04:17.796 16:12:46 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:17.796 16:12:46 -- setup/common.sh@33 -- # echo 0
00:04:17.796 16:12:46 -- setup/common.sh@33 -- # return 0
00:04:17.796 16:12:46 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
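The block above is one complete pass of the helper that dominates this log: setup/common.sh's get_meminfo reads /proc/meminfo, or a node's sysfs copy when a node argument is given, and scans line by line for the requested key. A minimal re-creation assembled from the xtrace statements; treat it as a sketch, not the verbatim SPDK source:

shopt -s extglob                       # needed for the Node-prefix strip below
get_meminfo() {
    local get=$1 node=$2               # e.g. get_meminfo HugePages_Surp 1
    local var val _
    local mem_f=/proc/meminfo mem
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")        # drop the "Node 1 " prefix on sysfs lines
    while IFS=': ' read -r var val _; do    # val keeps the number, _ eats the "kB" unit
        [[ $var == "$get" ]] || continue    # the long compare/continue runs in the trace
        echo "$val" && return 0             # the "# echo 0" / "# return 0" pair on a hit
    done < <(printf '%s\n' "${mem[@]}")
}

Each run of "# continue" entries in the trace is that while loop skipping non-matching fields; the echo/return pair marks the hit on the requested key.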
00:04:17.796 16:12:46 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:17.796 16:12:46 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:17.796 16:12:46 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:17.796 16:12:46 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513'
00:04:17.796 node0=512 expecting 513
00:04:17.796 16:12:46 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:17.796 16:12:46 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:17.796 16:12:46 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:17.796 16:12:46 -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512'
00:04:17.796 node1=513 expecting 512
00:04:17.796 16:12:46 -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]]
00:04:17.796
00:04:17.796 real 0m3.509s
00:04:17.796 user 0m1.269s
00:04:17.796 sys 0m2.252s
00:04:17.796 16:12:46 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:17.796 16:12:46 -- common/autotest_common.sh@10 -- # set +x
00:04:17.796 ************************************
00:04:17.796 END TEST odd_alloc
00:04:17.796 ************************************
00:04:17.796 16:12:46 -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc
00:04:17.796 16:12:46 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:17.796 16:12:46 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:17.796 16:12:46 -- common/autotest_common.sh@10 -- # set +x
00:04:17.796 ************************************
00:04:17.796 START TEST custom_alloc
00:04:17.796 ************************************
00:04:17.796 16:12:46 -- common/autotest_common.sh@1104 -- # custom_alloc
00:04:17.796 16:12:46 -- setup/hugepages.sh@167 -- # local IFS=,
00:04:17.796 16:12:46 -- setup/hugepages.sh@169 -- # local node
00:04:17.796 16:12:46 -- setup/hugepages.sh@170 -- # nodes_hp=()
00:04:17.796 16:12:46 -- setup/hugepages.sh@170 -- # local nodes_hp
00:04:17.796 16:12:46 -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0
00:04:17.796 16:12:46 -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576
00:04:17.796 16:12:46 -- setup/hugepages.sh@49 -- # local size=1048576
00:04:17.796 16:12:46 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:04:17.796 16:12:46 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:17.796 16:12:46 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:04:17.796 16:12:46 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:04:17.796 16:12:46 -- setup/hugepages.sh@62 -- # user_nodes=()
00:04:17.796 16:12:46 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:17.796 16:12:46 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:04:17.796 16:12:46 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:17.796 16:12:46 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:17.796 16:12:46 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:17.796 16:12:46 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:04:17.796 16:12:46 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:04:17.796 16:12:46 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:17.796 16:12:46 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256
00:04:17.796 16:12:46 -- setup/hugepages.sh@83 -- # : 256
00:04:17.796 16:12:46 -- setup/hugepages.sh@84 -- # : 1
00:04:17.796 16:12:46 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:17.796 16:12:46 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256
00:04:17.796 16:12:46 -- setup/hugepages.sh@83 -- # : 0
00:04:17.796 16:12:46 -- setup/hugepages.sh@84 -- # : 0
00:04:17.796 16:12:46 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:17.796 16:12:46 -- setup/hugepages.sh@175 -- # nodes_hp[0]=512
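The sizing just traced is plain arithmetic: the requested pool size in kB divided by the 2048 kB hugepage size gives the page count (1048576 kB -> 512 pages), which get_test_nr_hugepages_per_node then splits evenly over the two nodes (256 + 256). A sketch under those assumptions; the even-split detail is inferred from the values in the trace rather than read out of the SPDK source:

default_hugepages=2048   # kB, per "Hugepagesize: 2048 kB" in the meminfo dumps below
nodes_test=()
get_test_nr_hugepages() {
    local size=$1                                 # requested pool size in kB
    (( size >= default_hugepages )) || return 1
    nr_hugepages=$(( size / default_hugepages ))  # 1048576 -> 512, 2097152 -> 1024
}
get_test_nr_hugepages_per_node() {
    local _nr_hugepages=$nr_hugepages _no_nodes=2 n
    for (( n = _no_nodes - 1; n >= 0; n-- )); do  # last node first, as in the trace
        nodes_test[n]=$(( _nr_hugepages / _no_nodes ))   # even split: 256 + 256
    done
}
get_test_nr_hugepages 1048576 && get_test_nr_hugepages_per_node
echo "${nodes_test[@]}"   # 256 256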
00:04:17.796 16:12:46 -- setup/hugepages.sh@176 -- # (( 2 > 1 ))
00:04:17.796 16:12:46 -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152
00:04:17.796 16:12:46 -- setup/hugepages.sh@49 -- # local size=2097152
00:04:17.796 16:12:46 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:04:17.796 16:12:46 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:17.796 16:12:46 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:17.796 16:12:46 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:04:17.796 16:12:46 -- setup/hugepages.sh@62 -- # user_nodes=()
00:04:17.796 16:12:46 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:17.796 16:12:46 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:17.796 16:12:46 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:17.796 16:12:46 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:17.796 16:12:46 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:17.796 16:12:46 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:04:17.796 16:12:46 -- setup/hugepages.sh@74 -- # (( 1 > 0 ))
00:04:17.796 16:12:46 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:04:17.796 16:12:46 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512
00:04:17.796 16:12:46 -- setup/hugepages.sh@78 -- # return 0
00:04:17.796 16:12:46 -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024
00:04:17.796 16:12:46 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}"
00:04:17.796 16:12:46 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:04:17.796 16:12:46 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] ))
00:04:17.796 16:12:46 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}"
00:04:17.796 16:12:46 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:04:17.796 16:12:46 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] ))
00:04:17.796 16:12:46 -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node
00:04:17.796 16:12:46 -- setup/hugepages.sh@62 -- # user_nodes=()
00:04:17.796 16:12:46 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:17.796 16:12:46 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:17.796 16:12:46 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:17.796 16:12:46 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:17.796 16:12:46 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:17.796 16:12:46 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:04:17.796 16:12:46 -- setup/hugepages.sh@74 -- # (( 2 > 0 ))
00:04:17.796 16:12:46 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:04:17.796 16:12:46 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512
00:04:17.796 16:12:46 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:04:17.796 16:12:46 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024
00:04:17.796 16:12:46 -- setup/hugepages.sh@78 -- # return 0
00:04:17.796 16:12:46 -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'
00:04:17.796 16:12:46 -- setup/hugepages.sh@187 -- # setup output
00:04:17.796 16:12:46 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:17.796 16:12:46 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
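The HUGENODE string that setup.sh receives is just the per-node pool sizes serialized as nodes_hp[i]=N tokens and comma-joined, which is why custom_alloc set local IFS=, at the top. A sketch of the build step as traced, plus a hypothetical consumer-side parse inferred from the string format only (not from setup.sh itself); the driver-binding output from the actual setup.sh run follows:

IFS=,                                   # makes ${arr[*]} join with commas, as custom_alloc does
nodes_hp=([0]=512 [1]=1024)             # the two pools sized above
HUGENODE=()
for node in "${!nodes_hp[@]}"; do
    HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
done
echo "HUGENODE=${HUGENODE[*]}"          # HUGENODE=nodes_hp[0]=512,nodes_hp[1]=1024

# hypothetical consumer side: split the string back into per-node requests
IFS=, read -ra tokens <<< "${HUGENODE[*]}"
for tok in "${tokens[@]}"; do
    printf '%s -> %s pages\n' "${tok%%=*}" "${tok#*=}"
done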
00:04:21.083 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:21.083 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:21.083 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:21.083 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:21.083 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:21.083 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:21.083 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:21.083 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:21.083 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:21.083 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:21.083 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:21.083 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:21.083 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:21.083 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:21.083 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:21.083 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:21.083 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:21.083 16:12:49 -- setup/hugepages.sh@188 -- # nr_hugepages=1536
00:04:21.083 16:12:49 -- setup/hugepages.sh@188 -- # verify_nr_hugepages
00:04:21.083 16:12:49 -- setup/hugepages.sh@89 -- # local node
00:04:21.083 16:12:49 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:21.083 16:12:49 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:21.083 16:12:49 -- setup/hugepages.sh@92 -- # local surp
00:04:21.083 16:12:49 -- setup/hugepages.sh@93 -- # local resv
00:04:21.083 16:12:49 -- setup/hugepages.sh@94 -- # local anon
00:04:21.083 16:12:49 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:21.083 16:12:49 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:21.083 16:12:49 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:21.083 16:12:49 -- setup/common.sh@18 -- # local node=
00:04:21.083 16:12:49 -- setup/common.sh@19 -- # local var val
00:04:21.083 16:12:49 -- setup/common.sh@20 -- # local mem_f mem
00:04:21.083 16:12:49 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:21.083 16:12:49 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:21.083 16:12:49 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:21.083 16:12:49 -- setup/common.sh@28 -- # mapfile -t mem
00:04:21.083 16:12:49 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:21.083 16:12:49 -- setup/common.sh@31 -- # IFS=': '
00:04:21.083 16:12:49 -- setup/common.sh@31 -- # read -r var val _
00:04:21.083 16:12:49 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 41509828 kB' 'MemAvailable: 43365120 kB' 'Buffers: 4300 kB' 'Cached: 11310196 kB' 'SwapCached: 28 kB' 'Active: 10458396 kB' 'Inactive: 1486324 kB' 'Active(anon): 9975140 kB' 'Inactive(anon): 32212 kB' 'Active(file): 483256 kB' 'Inactive(file): 1454112 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 633612 kB' 'Mapped: 176208 kB' 'Shmem: 9377128 kB' 'KReclaimable: 565620 kB' 'Slab: 1211020 kB' 'SReclaimable: 565620 kB' 'SUnreclaim: 645400 kB' 'KernelStack: 21952 kB' 'PageTables: 8868 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963356 kB' 'Committed_AS: 11400964 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216580 kB' 'VmallocChunk: 0 kB' 'Percpu: 104832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3177844 kB' 'DirectMap2M: 52082688 kB' 'DirectMap1G: 13631488 kB'
00:04:21.083 16:12:49 -- setup/common.sh@32 -- # continue [the @31 read / @32 compare pair repeats for every field from MemTotal through HardwareCorrupted; none match AnonHugePages]
00:04:21.084 16:12:49 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:21.084 16:12:49 -- setup/common.sh@33 -- # echo 0
00:04:21.084 16:12:49 -- setup/common.sh@33 -- # return 0
00:04:21.084 16:12:49 -- setup/hugepages.sh@97 -- # anon=0
00:04:21.084 16:12:49 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:21.084 16:12:49 -- setup/common.sh@17 -- # local get=HugePages_Surp [same @17-@31 preamble as above, node unset]
00:04:21.084 16:12:49 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 41512476 kB' 'MemAvailable: 43367768 kB' 'Buffers: 4300 kB' 'Cached: 11310196 kB' 'SwapCached: 28 kB' 'Active: 10458224 kB' 'Inactive: 1486324 kB' 'Active(anon): 9974968 kB' 'Inactive(anon): 32212 kB' 'Active(file): 483256 kB' 'Inactive(file): 1454112 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 632852 kB' 'Mapped: 176196 kB' 'Shmem: 9377128 kB' 'KReclaimable: 565620 kB' 'Slab: 1210944 kB' 'SReclaimable: 565620 kB' 'SUnreclaim: 645324 kB' 'KernelStack: 22112 kB' 'PageTables: 8992 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963356 kB' 'Committed_AS: 11400976 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216644 kB' 'VmallocChunk: 0 kB' 'Percpu: 104832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3177844 kB' 'DirectMap2M: 52082688 kB' 'DirectMap1G: 13631488 kB'
00:04:21.084 16:12:49 -- setup/common.sh@32 -- # continue [the @31 read / @32 compare pair repeats for every field from MemTotal through HugePages_Rsvd; none match HugePages_Surp]
00:04:21.085 16:12:49 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:21.085 16:12:49 -- setup/common.sh@33 -- # echo 0
00:04:21.085 16:12:49 -- setup/common.sh@33 -- # return 0
00:04:21.085 16:12:49 -- setup/hugepages.sh@99 -- # surp=0
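With anon and surp both read back as 0, one counter (HugePages_Rsvd) remains before the checks. Stripped of the field-scan noise, the verification reduces to a few reads through get_meminfo (sketched earlier) plus the accounting identity asserted at hugepages.sh@107 and @109 below. A condensed sketch, with the literal 1536 taken from the trace; the real verify_nr_hugepages also folds per-node counts into sorted_t/sorted_s, which this skips:

verify_nr_hugepages() {
    local nr_hugepages=1536                  # 512 + 1024 requested above
    local anon surp resv total
    anon=$(get_meminfo AnonHugePages)        # reported below; not part of the identity
    surp=$(get_meminfo HugePages_Surp)       # pages allocated beyond the static pool
    resv=$(get_meminfo HugePages_Rsvd)       # reserved but not yet faulted in
    total=$(get_meminfo HugePages_Total)
    echo "nr_hugepages=$nr_hugepages resv_hugepages=$resv surplus_hugepages=$surp anon_hugepages=$anon"
    (( total == nr_hugepages + surp + resv )) && (( total == nr_hugepages ))
}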
-- # local mem_f mem 00:04:21.085 16:12:49 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:21.085 16:12:49 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:21.085 16:12:49 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:21.085 16:12:49 -- setup/common.sh@28 -- # mapfile -t mem 00:04:21.085 16:12:49 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:21.085 16:12:49 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.085 16:12:49 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.085 16:12:49 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 41517676 kB' 'MemAvailable: 43372968 kB' 'Buffers: 4300 kB' 'Cached: 11310208 kB' 'SwapCached: 28 kB' 'Active: 10457168 kB' 'Inactive: 1486324 kB' 'Active(anon): 9973912 kB' 'Inactive(anon): 32212 kB' 'Active(file): 483256 kB' 'Inactive(file): 1454112 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 632212 kB' 'Mapped: 176120 kB' 'Shmem: 9377140 kB' 'KReclaimable: 565620 kB' 'Slab: 1210736 kB' 'SReclaimable: 565620 kB' 'SUnreclaim: 645116 kB' 'KernelStack: 22192 kB' 'PageTables: 9560 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963356 kB' 'Committed_AS: 11402496 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216628 kB' 'VmallocChunk: 0 kB' 'Percpu: 104832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3177844 kB' 'DirectMap2M: 52082688 kB' 'DirectMap1G: 13631488 kB' 00:04:21.085 16:12:49 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.085 16:12:49 -- setup/common.sh@32 -- # continue 00:04:21.085 16:12:49 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.085 16:12:49 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.085 16:12:49 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.085 16:12:49 -- setup/common.sh@32 -- # continue 00:04:21.086 16:12:49 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.086 16:12:49 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.086 16:12:49 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.086 16:12:49 -- setup/common.sh@32 -- # continue 00:04:21.086 16:12:49 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.086 16:12:49 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.086 16:12:49 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.086 16:12:49 -- setup/common.sh@32 -- # continue 00:04:21.086 16:12:49 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.086 16:12:49 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.086 16:12:49 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.086 16:12:49 -- setup/common.sh@32 -- # continue 00:04:21.086 16:12:49 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.086 16:12:49 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.086 16:12:49 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.086 16:12:49 -- setup/common.sh@32 -- # continue 00:04:21.086 16:12:49 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.086 16:12:49 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.086 
[xtrace condensed: get_meminfo compares each remaining /proc/meminfo key (Active ... HugePages_Free) against HugePages_Rsvd and hits 'continue' on every non-match]
00:04:21.087 16:12:49 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:21.087 16:12:49 -- setup/common.sh@33 -- # echo 0
00:04:21.087 16:12:49 -- setup/common.sh@33 -- # return 0
00:04:21.087 16:12:49 -- setup/hugepages.sh@100 -- # resv=0
00:04:21.087 16:12:49 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536
00:04:21.087 nr_hugepages=1536
00:04:21.087 16:12:49 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:21.087 resv_hugepages=0
00:04:21.087 16:12:49 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:21.087 surplus_hugepages=0
00:04:21.087 16:12:49 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:21.087 anon_hugepages=0
00:04:21.087 16:12:49 -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv ))
00:04:21.087 16:12:49 -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages ))
00:04:21.087 16:12:49 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
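The repeated '[[ <key> == <pattern> ]] / continue' entries in the trace are setup/common.sh's get_meminfo walking /proc/meminfo one line at a time. A minimal sketch of that pattern, reconstructed from the trace alone (illustrative, not the verbatim upstream helper):

shopt -s extglob                                    # the +([0-9]) patterns below need extglob
get_meminfo() {
    local get=$1 node=${2:-}                        # key to look up, optional NUMA node
    local mem_f=/proc/meminfo
    [[ -e /sys/devices/system/node/node$node/meminfo ]] &&
        mem_f=/sys/devices/system/node/node$node/meminfo
    local mem line var val _
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")                # per-node files prefix each line with "Node N"
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"      # e.g. var=HugePages_Rsvd val=0
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done
    return 1
}

Called as get_meminfo HugePages_Rsvd it prints the 0 captured into resv above; with a node argument it reads the per-node sysfs copy instead of /proc/meminfo.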
00:04:21.087 16:12:49 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:21.087 16:12:49 -- setup/common.sh@18 -- # local node=
00:04:21.087 16:12:49 -- setup/common.sh@19 -- # local var val
00:04:21.087 16:12:49 -- setup/common.sh@20 -- # local mem_f mem
00:04:21.087 16:12:49 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:21.087 16:12:49 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:21.087 16:12:49 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:21.087 16:12:49 -- setup/common.sh@28 -- # mapfile -t mem
00:04:21.087 16:12:49 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:21.087 16:12:49 -- setup/common.sh@31 -- # IFS=': '
00:04:21.087 16:12:49 -- setup/common.sh@31 -- # read -r var val _
00:04:21.087 16:12:49 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 41519120 kB' 'MemAvailable: 43374412 kB' 'Buffers: 4300 kB' 'Cached: 11310224 kB' 'SwapCached: 28 kB' 'Active: 10457328 kB' 'Inactive: 1486324 kB' 'Active(anon): 9974072 kB' 'Inactive(anon): 32212 kB' 'Active(file): 483256 kB' 'Inactive(file): 1454112 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 632348 kB' 'Mapped: 176120 kB' 'Shmem: 9377156 kB' 'KReclaimable: 565620 kB' 'Slab: 1210576 kB' 'SReclaimable: 565620 kB' 'SUnreclaim: 644956 kB' 'KernelStack: 22192 kB' 'PageTables: 9416 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963356 kB' 'Committed_AS: 11401004 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216628 kB' 'VmallocChunk: 0 kB' 'Percpu: 104832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3177844 kB' 'DirectMap2M: 52082688 kB' 'DirectMap1G: 13631488 kB'
[xtrace condensed: the same key-by-key scan over the dump above, this time until HugePages_Total matches]
00:04:21.088 16:12:49 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:21.088 16:12:49 -- setup/common.sh@33 -- # echo 1536
00:04:21.088 16:12:49 -- setup/common.sh@33 -- # return 0
00:04:21.088 16:12:49 -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv ))
00:04:21.088 16:12:49 -- setup/hugepages.sh@112 -- # get_nodes
00:04:21.088 16:12:49 -- setup/hugepages.sh@27 -- # local node
00:04:21.088 16:12:49 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:21.088 16:12:49 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:21.088 16:12:49 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:21.088 16:12:49 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:21.088 16:12:49 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:21.088 16:12:49 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:21.088 16:12:49 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:21.088 16:12:49 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:21.088 16:12:49 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:21.088 16:12:49 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:21.088 16:12:49 -- setup/common.sh@18 -- # local node=0
00:04:21.088 16:12:49 -- setup/common.sh@19 -- # local var val
00:04:21.088 16:12:49 -- setup/common.sh@20 -- # local mem_f mem
00:04:21.088 16:12:49 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:21.088 16:12:49 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:21.088 16:12:49 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:21.088 16:12:49 -- setup/common.sh@28 -- # mapfile -t mem
00:04:21.088 16:12:49 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:21.088 16:12:49 -- setup/common.sh@31 -- # IFS=': '
00:04:21.088 16:12:49 -- setup/common.sh@31 -- # read -r var val _
00:04:21.088 16:12:49 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 24913924 kB' 'MemUsed: 7678160 kB' 'SwapCached: 16 kB' 'Active: 4522788 kB' 'Inactive: 338724 kB' 'Active(anon): 4304568 kB' 'Inactive(anon): 16 kB' 'Active(file): 218220 kB' 'Inactive(file): 338708 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4477608 kB' 'Mapped: 109940 kB' 'AnonPages: 387480 kB' 'Shmem: 3920664 kB' 'KernelStack: 12984 kB' 'PageTables: 5692 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 349476 kB' 'Slab: 640716 kB' 'SReclaimable: 349476 kB' 'SUnreclaim: 291240 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
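The two nodes_sys assignments above record what the kernel already provides per node: 512 pages on node0 and 1024 on node1, matching the HugePages_Total fields of the per-node sysfs dumps. One way to collect the same numbers with the helper sketched earlier (illustrative; the upstream get_nodes may read a different sysfs file):

shopt -s extglob
nodes_sys=()
for node in /sys/devices/system/node/node+([0-9]); do
    id=${node##*node}                                 # /sys/.../node1 -> 1
    nodes_sys[id]=$(get_meminfo HugePages_Total "$id")
done
echo "nodes: ${!nodes_sys[*]} -> ${nodes_sys[*]}"     # nodes: 0 1 -> 512 1024 on this box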
[xtrace condensed: the key-by-key scan, now over the node0 copy, until HugePages_Surp matches]
00:04:21.089 16:12:49 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:21.089 16:12:49 -- setup/common.sh@33 -- # echo 0
00:04:21.089 16:12:49 -- setup/common.sh@33 -- # return 0
00:04:21.089 16:12:49 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:21.089 16:12:49 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:21.089 16:12:49 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:21.089 16:12:49 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
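Node 0 contributed no surplus, so its expected count is unchanged, and the same query now runs for node 1. The accounting being performed, in sketch form (resv holds the 0 returned by the HugePages_Rsvd lookup earlier; both additions are 0 in this run):

for node in "${!nodes_test[@]}"; do
    (( nodes_test[node] += resv ))                                   # global reserved pages
    (( nodes_test[node] += $(get_meminfo HugePages_Surp "$node") ))  # per-node surplus
done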
00:04:21.089 16:12:49 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:21.089 16:12:49 -- setup/common.sh@18 -- # local node=1
00:04:21.089 16:12:49 -- setup/common.sh@19 -- # local var val
00:04:21.089 16:12:49 -- setup/common.sh@20 -- # local mem_f mem
00:04:21.089 16:12:49 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:21.089 16:12:49 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:21.089 16:12:49 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:21.089 16:12:49 -- setup/common.sh@28 -- # mapfile -t mem
00:04:21.089 16:12:49 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:21.089 16:12:49 -- setup/common.sh@31 -- # IFS=': '
00:04:21.089 16:12:49 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27703148 kB' 'MemFree: 16602436 kB' 'MemUsed: 11100712 kB' 'SwapCached: 12 kB' 'Active: 5934072 kB' 'Inactive: 1147600 kB' 'Active(anon): 5669036 kB' 'Inactive(anon): 32196 kB' 'Active(file): 265036 kB' 'Inactive(file): 1115404 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 6836968 kB' 'Mapped: 66180 kB' 'AnonPages: 244848 kB' 'Shmem: 5456516 kB' 'KernelStack: 9080 kB' 'PageTables: 3584 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 216144 kB' 'Slab: 569828 kB' 'SReclaimable: 216144 kB' 'SUnreclaim: 353684 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:04:21.089 16:12:49 -- setup/common.sh@31 -- # read -r var val _
[xtrace condensed: the key-by-key scan over the node1 copy until HugePages_Surp matches]
00:04:21.090 16:12:49 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:21.090 16:12:49 -- setup/common.sh@33 -- # echo 0
00:04:21.349 16:12:49 -- setup/common.sh@33 -- # return 0
00:04:21.349 16:12:49 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:21.349 16:12:49 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:21.349 16:12:49 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:21.349 16:12:49 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:21.349 16:12:49 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:04:21.349 node0=512 expecting 512
00:04:21.349 16:12:49 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:21.349 16:12:49 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:21.349 16:12:49 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:21.349 16:12:49 -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024'
00:04:21.349 node1=1024 expecting 1024
00:04:21.349 16:12:49 -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]]
00:04:21.349
00:04:21.349 real 0m3.375s
00:04:21.349 user 0m1.253s
00:04:21.349 sys 0m2.153s
00:04:21.349 16:12:49 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:21.349 16:12:49 -- common/autotest_common.sh@10 -- # set +x
00:04:21.349 ************************************
00:04:21.349 END TEST custom_alloc
00:04:21.349 ************************************
00:04:21.349 16:12:49 -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc
00:04:21.349 16:12:49 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:21.349 16:12:49 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:21.349 16:12:49 -- common/autotest_common.sh@10 -- # set +x
00:04:21.349 ************************************
00:04:21.349 START TEST no_shrink_alloc
00:04:21.349 ************************************
00:04:21.349 16:12:49 -- common/autotest_common.sh@1104 -- # no_shrink_alloc
00:04:21.349 16:12:49 -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0
00:04:21.349 16:12:49 -- setup/hugepages.sh@49 -- # local size=2097152
00:04:21.349 16:12:49 -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:04:21.349 16:12:49 -- setup/hugepages.sh@51 -- # shift
00:04:21.349 16:12:49 -- setup/hugepages.sh@52 -- # node_ids=('0')
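Where the nr_hugepages=1024 just below comes from: get_test_nr_hugepages receives the requested size in kB and divides by the default hugepage size, 2048 kB per the Hugepagesize field in the dumps above. A sketch of the arithmetic (variable names are assumptions):

size=2097152                                   # kB requested by the test, i.e. 2 GiB
default_hugepages=2048                         # kB per 2 MiB hugepage
echo $(( size / default_hugepages ))           # -> 1024

The later dump's 'Hugetlb: 2097152 kB' line is consistent with this: 1024 pages x 2048 kB.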
00:04:21.349 16:12:49 -- setup/hugepages.sh@52 -- # local node_ids
00:04:21.349 16:12:49 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:21.349 16:12:49 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:21.349 16:12:49 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0
00:04:21.349 16:12:49 -- setup/hugepages.sh@62 -- # user_nodes=('0')
00:04:21.349 16:12:49 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:21.349 16:12:49 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:21.349 16:12:49 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:21.349 16:12:49 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:21.349 16:12:49 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:21.349 16:12:49 -- setup/hugepages.sh@69 -- # (( 1 > 0 ))
00:04:21.349 16:12:49 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:04:21.349 16:12:49 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024
00:04:21.349 16:12:49 -- setup/hugepages.sh@73 -- # return 0
00:04:21.349 16:12:49 -- setup/hugepages.sh@198 -- # setup output
00:04:21.349 16:12:49 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:21.349 16:12:49 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:24.634 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:24.635 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:24.635 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:24.635 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:24.635 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:24.635 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:24.635 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:24.635 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:24.635 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:24.635 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:24.635 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:24.635 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:24.635 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:24.635 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:24.635 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:24.635 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:24.635 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:24.635 16:12:53 -- setup/hugepages.sh@199 -- # verify_nr_hugepages
00:04:24.635 16:12:53 -- setup/hugepages.sh@89 -- # local node
00:04:24.635 16:12:53 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:24.635 16:12:53 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:24.635 16:12:53 -- setup/hugepages.sh@92 -- # local surp
00:04:24.635 16:12:53 -- setup/hugepages.sh@93 -- # local resv
00:04:24.635 16:12:53 -- setup/hugepages.sh@94 -- # local anon
00:04:24.635 16:12:53 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
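The [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] check just above is verify_nr_hugepages inspecting the transparent-hugepage mode; because THP is not pinned to never on this host, AnonHugePages is sampled next so THP-backed memory is not confused with the test's reserved pool. A sketch of that guard (the sysfs path is the conventional location and is an assumption here):

anon=0
thp=$(cat /sys/kernel/mm/transparent_hugepage/enabled)   # e.g. "always [madvise] never"
if [[ $thp != *"[never]"* ]]; then
    anon=$(get_meminfo AnonHugePages)                    # kB of THP-backed anonymous memory
fi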
16:12:53 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:24.635 16:12:53 -- setup/common.sh@28 -- # mapfile -t mem 00:04:24.635 16:12:53 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:24.635 16:12:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.635 16:12:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.635 16:12:53 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42586144 kB' 'MemAvailable: 44441436 kB' 'Buffers: 4300 kB' 'Cached: 11310320 kB' 'SwapCached: 28 kB' 'Active: 10458572 kB' 'Inactive: 1486324 kB' 'Active(anon): 9975316 kB' 'Inactive(anon): 32212 kB' 'Active(file): 483256 kB' 'Inactive(file): 1454112 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 632936 kB' 'Mapped: 176168 kB' 'Shmem: 9377252 kB' 'KReclaimable: 565620 kB' 'Slab: 1210604 kB' 'SReclaimable: 565620 kB' 'SUnreclaim: 644984 kB' 'KernelStack: 21920 kB' 'PageTables: 8500 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 11398588 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216500 kB' 'VmallocChunk: 0 kB' 'Percpu: 104832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3177844 kB' 'DirectMap2M: 52082688 kB' 'DirectMap1G: 13631488 kB' 00:04:24.635
[setup/common.sh@31-@32 loop, elapsed 00:04:24.635: IFS=': '; read -r var val _; [[ <key> == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]; continue, repeated for every key of the snapshot above, in order, until the AnonHugePages line matches]
16:12:53 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.635 16:12:53 -- setup/common.sh@33 -- # echo 0 00:04:24.635 16:12:53 -- setup/common.sh@33 -- # return 0 00:04:24.635 16:12:53 -- setup/hugepages.sh@97 -- # anon=0 00:04:24.635
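The loop traced above is setup/common.sh's get_meminfo helper walking the snapshot one "Key: value" pair at a time. A minimal runnable sketch of that helper, reconstructed from the @17-@33 line references in the trace (a reading of the xtrace, not the verbatim SPDK source):

    #!/usr/bin/env bash
    shopt -s extglob                    # the +([0-9]) patterns below need extglob

    get_meminfo() {
        local get=$1 node=$2            # @17-@18: key to look up, optional NUMA node
        local var val _
        local mem_f mem line
        mem_f=/proc/meminfo             # @22: system-wide default
        [[ -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo    # @23-@24: per-node file
        mapfile -t mem < "$mem_f"               # @28: one array element per line
        mem=("${mem[@]#Node +([0-9]) }")        # @29: drop the "Node N " prefix per-node files carry
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"  # @31: split "Key: value kB"
            [[ $var == "$get" ]] || continue        # @32: skip until the requested key
            echo "$val"                             # @33: emit the bare number
            return 0
        done
        return 1
    }

    get_meminfo HugePages_Surp      # system-wide, as at hugepages.sh@99
    get_meminfo HugePages_Surp 0    # node0 only, as at hugepages.sh@117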
16:12:53 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:24.635 16:12:53 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:24.635 16:12:53 -- setup/common.sh@18 -- # local node= 00:04:24.635 16:12:53 -- setup/common.sh@19 -- # local var val 00:04:24.635 16:12:53 -- setup/common.sh@20 -- # local mem_f mem 00:04:24.635 16:12:53 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:24.635 16:12:53 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:24.635 16:12:53 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:24.635 16:12:53 -- setup/common.sh@28 -- # mapfile -t mem 00:04:24.635 16:12:53 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:24.635 16:12:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.635 16:12:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.635 16:12:53 -- setup/common.sh@16 -- # printf '%s\n' [second full /proc/meminfo snapshot; identical to the one above except 'MemFree: 42586584 kB' 'MemAvailable: 44441876 kB' 'Cached: 11310324 kB' 'Active: 10457788 kB' 'Active(anon): 9974532 kB' 'AnonPages: 632688 kB' 'Mapped: 176132 kB' 'Shmem: 9377256 kB' 'Slab: 1210656 kB' 'SUnreclaim: 645036 kB' 'KernelStack: 21904 kB' 'PageTables: 8452 kB' 'Committed_AS: 11398600 kB' 'VmallocUsed: 216468 kB'] 00:04:24.635
[setup/common.sh@31-@32 scan loop as before, elapsed 00:04:24.636: every key compared against \H\u\g\e\P\a\g\e\s\_\S\u\r\p and skipped until the match]
16:12:53 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.636 16:12:53 -- setup/common.sh@33 -- # echo 0 00:04:24.636 16:12:53 -- setup/common.sh@33 -- # return 0 00:04:24.636 16:12:53 -- setup/hugepages.sh@99 -- # surp=0 00:04:24.636
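The backslash-riddled tokens such as \H\u\g\e\P\a\g\e\s\_\S\u\r\p are not in the script source. Under set -x, bash re-prints a quoted right-hand side of == inside [[ ]] with every character escaped, marking it as a literal string rather than a glob pattern. A two-line reproduction (assumes bash; the variable name is illustrative):

    set -x
    get=HugePages_Surp
    [[ MemTotal == "$get" ]]    # traces as: [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]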
16:12:53 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:24.636 16:12:53 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:24.636 [same @18-@31 prologue as the previous call: node=, mem_f=/proc/meminfo, mapfile -t mem, "Node N " prefix strip] 16:12:53 -- setup/common.sh@16 -- # printf '%s\n' [third snapshot; as the second except 'Active: 10457828 kB' 'Active(anon): 9974572 kB' 'AnonPages: 632728 kB' 'KernelStack: 21920 kB' 'PageTables: 8496 kB' 'Committed_AS: 11398616 kB'] 00:04:24.636
[setup/common.sh@31-@32 scan loop, elapsed 00:04:24.636 to 00:04:24.637: every key compared against \H\u\g\e\P\a\g\e\s\_\R\s\v\d and skipped until the match]
16:12:53 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.637 16:12:53 -- setup/common.sh@33 -- # echo 0 00:04:24.637 16:12:53 -- setup/common.sh@33 -- # return 0 00:04:24.637 16:12:53 -- setup/hugepages.sh@100 -- # resv=0 00:04:24.637
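In script form the three lookups just traced reduce to three command substitutions (sketch; the variable names and hugepages.sh line numbers are taken directly from the trace):

    anon=$(get_meminfo AnonHugePages)    # hugepages.sh@97  -> anon=0 here
    surp=$(get_meminfo HugePages_Surp)   # hugepages.sh@99  -> surp=0 here
    resv=$(get_meminfo HugePages_Rsvd)   # hugepages.sh@100 -> resv=0 here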
16:12:53 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:24.637
nr_hugepages=1024
16:12:53 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:24.637
resv_hugepages=0
16:12:53 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:24.637
surplus_hugepages=0
16:12:53 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:24.637
anon_hugepages=0
16:12:53 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:24.637 16:12:53 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:24.637 16:12:53 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:24.637 16:12:53 -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:24.637 [same @18-@31 prologue: node=, mem_f=/proc/meminfo] 16:12:53 -- setup/common.sh@16 -- # printf '%s\n' [fourth snapshot; as the third except 'MemFree: 42586416 kB' 'MemAvailable: 44441708 kB' 'Cached: 11310360 kB' 'Active: 10457464 kB' 'Active(anon): 9974208 kB' 'AnonPages: 632284 kB' 'Shmem: 9377292 kB' 'KernelStack: 21888 kB' 'PageTables: 8404 kB' 'Committed_AS: 11398628 kB'] 00:04:24.637
[setup/common.sh@31-@32 scan loop, elapsed 00:04:24.637 to 00:04:24.638: every key compared against \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l and skipped until the match]
16:12:53 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.638 16:12:53 -- setup/common.sh@33 -- # echo 1024 00:04:24.638 16:12:53 -- setup/common.sh@33 -- # return 0 00:04:24.638 16:12:53 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:24.638
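The @107-@110 guards assert that the pool visible to the kernel is exactly the requested one: 1024 pages total, none reserved, none surplus, none anonymous. Sketched below; the expanded 1024 on the left of each == comes from a command substitution that xtrace has already resolved, so the exact source expression is an assumption:

    nr_hugepages=1024                         # requested pool size, echoed at @102
    (( $(get_meminfo HugePages_Total) == nr_hugepages + surp + resv )) \
        || { echo "hugepage accounting mismatch" >&2; exit 1; }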
16:12:53 -- setup/hugepages.sh@112 -- # get_nodes 00:04:24.638 16:12:53 -- setup/hugepages.sh@27 -- # local node 00:04:24.638 16:12:53 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:24.638 16:12:53 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:24.638 16:12:53 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:24.638 16:12:53 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:24.638 16:12:53 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:24.638 16:12:53 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:24.638
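get_nodes (hugepages.sh@27-@33) enumerates the NUMA nodes and records each node's hugepage count: 1024 on node0 and 0 on node1 here, so the entire pool sits on node0. A sketch of the discovery loop as traced, plus the per-node re-check that follows (how @30 obtains the counts is not visible in this excerpt, so the sysfs read is an assumption, and nodes_test is initialized outside the excerpt):

    shopt -s extglob
    nodes_sys=()
    for node in /sys/devices/system/node/node+([0-9]); do     # @29
        # @30: per-node 2 MiB pool size; reading it from sysfs is an assumed source
        nodes_sys[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
    done
    no_nodes=${#nodes_sys[@]}                                 # @32: 2 on this machine
    (( no_nodes > 0 ))                                        # @33: sanity check

    for node in "${!nodes_test[@]}"; do                       # @115
        (( nodes_test[node] += resv ))                        # @116: fold reserved pages in
        get_meminfo HugePages_Surp "$node"                    # @117: per-node surplus query
    done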
16:12:53 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:24.638 16:12:53 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:24.638 16:12:53 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:24.638 16:12:53 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:24.638 16:12:53 -- setup/common.sh@18 -- # local node=0 00:04:24.638 16:12:53 -- setup/common.sh@19 -- # local var val 00:04:24.638 16:12:53 -- setup/common.sh@20 -- # local mem_f mem 00:04:24.638 16:12:53 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:24.638 16:12:53 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:24.638 16:12:53 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:24.638 16:12:53 -- setup/common.sh@28 -- # mapfile -t mem 00:04:24.638 16:12:53 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:24.638 16:12:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.638 16:12:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.638 16:12:53 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 23879764 kB' 'MemUsed: 8712320 kB' 'SwapCached: 16 kB' 'Active: 4523064 kB' 'Inactive: 338724 kB' 'Active(anon): 4304844 kB' 'Inactive(anon): 16 kB' 'Active(file): 218220 kB' 'Inactive(file): 338708 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4477672 kB' 'Mapped: 109948 kB' 'AnonPages: 387224 kB' 'Shmem: 3920728 kB' 'KernelStack: 12792 kB' 'PageTables: 4900 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 349476 kB' 'Slab: 640928 kB' 'SReclaimable: 349476 kB' 'SUnreclaim: 291452 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:24.638
[setup/common.sh@31-@32 scan loop now walking node0's keys against \H\u\g\e\P\a\g\e\s\_\S\u\r\p, elapsed 00:04:24.639; this excerpt ends mid-scan at 16:12:53 -- setup/common.sh@31 -- # IFS=': ']
00:04:24.639 16:12:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.639 16:12:53 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.639 16:12:53 -- setup/common.sh@32 -- # continue 00:04:24.639 16:12:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.639 16:12:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.639 16:12:53 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.639 16:12:53 -- setup/common.sh@32 -- # continue 00:04:24.639 16:12:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.639 16:12:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.639 16:12:53 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.639 16:12:53 -- setup/common.sh@32 -- # continue 00:04:24.639 16:12:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.639 16:12:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.639 16:12:53 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.639 16:12:53 -- setup/common.sh@32 -- # continue 00:04:24.639 16:12:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.639 16:12:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.639 16:12:53 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.639 16:12:53 -- setup/common.sh@33 -- # echo 0 00:04:24.639 16:12:53 -- setup/common.sh@33 -- # return 0 00:04:24.639 16:12:53 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:24.639 16:12:53 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:24.639 16:12:53 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:24.639 16:12:53 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:24.639 16:12:53 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:24.639 node0=1024 expecting 1024 00:04:24.639 16:12:53 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:24.639 16:12:53 -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no 00:04:24.639 16:12:53 -- setup/hugepages.sh@202 -- # NRHUGE=512 00:04:24.639 16:12:53 -- setup/hugepages.sh@202 -- # setup output 00:04:24.639 16:12:53 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:24.639 16:12:53 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:27.931 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:27.931 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:27.931 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:27.931 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:27.931 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:27.931 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:27.931 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:27.931 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:27.931 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:27.931 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:27.931 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:27.931 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:27.931 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:27.931 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:27.931 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:27.931 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:27.931 0000:d8:00.0 (8086 0a54): Already 
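The two lookups above (HugePages_Total, then HugePages_Surp for node0) are common.sh's get_meminfo helper expanding under xtrace: pick /proc/meminfo or the per-node file, strip the "Node N " prefix, then scan key/value pairs until the requested field matches. A minimal standalone sketch of that pattern, reconstructed from the traced statements (the for/here-string loop stands in for the script's while-read form; treat the details as an illustration, not the exact script):

    #!/usr/bin/env bash
    # Sketch of the get_meminfo pattern traced above: echo the value of one
    # meminfo field, optionally from a per-node meminfo file.
    shopt -s extglob  # needed for the +([0-9]) pattern below

    get_meminfo() {
        local get=$1 node=$2
        local mem_f=/proc/meminfo
        local mem line var val _

        # Per-node files live under /sys/devices/system/node/nodeN/meminfo
        # and prefix every line with "Node N ".
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi

        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")  # strip the "Node N " prefix

        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            # Keys compare literally, which is why xtrace shows the target
            # quoted character by character (\H\u\g\e...).
            if [[ $var == "$get" ]]; then
                echo "$val"
                return 0
            fi
        done
        return 1
    }

    get_meminfo HugePages_Total    # 1024 in this run
    get_meminfo HugePages_Surp 0   # 0 on node0 in this run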
00:04:27.931 INFO: Requested 512 hugepages but 1024 already allocated on node0
00:04:27.931 16:12:56 -- setup/hugepages.sh@204 -- # verify_nr_hugepages
00:04:27.931 16:12:56 -- setup/hugepages.sh@89 -- # local node
00:04:27.931 16:12:56 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:27.931 16:12:56 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:27.931 16:12:56 -- setup/hugepages.sh@92 -- # local surp
00:04:27.931 16:12:56 -- setup/hugepages.sh@93 -- # local resv
00:04:27.931 16:12:56 -- setup/hugepages.sh@94 -- # local anon
00:04:27.931 16:12:56 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:27.931 16:12:56 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:27.931 16:12:56 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:27.931 16:12:56 -- setup/common.sh@18 -- # local node=
00:04:27.931 16:12:56 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:27.931 16:12:56 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:27.931 16:12:56 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:27.931 16:12:56 -- setup/common.sh@28 -- # mapfile -t mem
00:04:27.931 16:12:56 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:27.931 16:12:56 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42603584 kB' 'MemAvailable: 44458876 kB' 'Buffers: 4300 kB' 'Cached: 11310436 kB' 'SwapCached: 28 kB' 'Active: 10459048 kB' 'Inactive: 1486324 kB' 'Active(anon): 9975792 kB' 'Inactive(anon): 32212 kB' 'Active(file): 483256 kB' 'Inactive(file): 1454112 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 633336 kB' 'Mapped: 176228 kB' 'Shmem: 9377368 kB' 'KReclaimable: 565620 kB' 'Slab: 1210756 kB' 'SReclaimable: 565620 kB' 'SUnreclaim: 645136 kB' 'KernelStack: 21904 kB' 'PageTables: 8452 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 11399232 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216500 kB' 'VmallocChunk: 0 kB' 'Percpu: 104832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3177844 kB' 'DirectMap2M: 52082688 kB' 'DirectMap1G: 13631488 kB'
00:04:27.932 16:12:56 -- setup/common.sh@32 -- # [xtrace scan of the snapshot elided: MemTotal .. HardwareCorrupted, no match]
00:04:27.932 16:12:56 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:27.932 16:12:56 -- setup/common.sh@33 -- # echo 0
00:04:27.932 16:12:56 -- setup/common.sh@33 -- # return 0
00:04:27.932 16:12:56 -- setup/hugepages.sh@97 -- # anon=0
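The @96 test above is matching the bracketed mode in the THP sysfs string ("always [madvise] never" on this machine): AnonHugePages is only folded into the verification when transparent hugepages are not set to [never]. A short sketch of that gate, reusing the get_meminfo sketch above (standard sysfs path; the variable names are mine):

    # Only count THP-backed anonymous memory when THP is not disabled.
    thp_mode=$(</sys/kernel/mm/transparent_hugepage/enabled)  # e.g. "always [madvise] never"

    anon=0
    if [[ $thp_mode != *"[never]"* ]]; then
        anon=$(get_meminfo AnonHugePages)  # kB of THP-backed anon memory
    fi
    echo "anon_hugepages=$anon"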
00:04:27.932 16:12:56 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:27.932 16:12:56 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:27.932 16:12:56 -- setup/common.sh@18 -- # local node=
00:04:27.932 16:12:56 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:27.932 16:12:56 -- setup/common.sh@28 -- # mapfile -t mem
00:04:27.932 16:12:56 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:27.932 16:12:56 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42603448 kB' 'MemAvailable: 44458740 kB' 'Buffers: 4300 kB' 'Cached: 11310440 kB' 'SwapCached: 28 kB' 'Active: 10458488 kB' 'Inactive: 1486324 kB' 'Active(anon): 9975232 kB' 'Inactive(anon): 32212 kB' 'Active(file): 483256 kB' 'Inactive(file): 1454112 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 633292 kB' 'Mapped: 176136 kB' 'Shmem: 9377372 kB' 'KReclaimable: 565620 kB' 'Slab: 1210744 kB' 'SReclaimable: 565620 kB' 'SUnreclaim: 645124 kB' 'KernelStack: 21936 kB' 'PageTables: 8536 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 11399244 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216516 kB' 'VmallocChunk: 0 kB' 'Percpu: 104832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3177844 kB' 'DirectMap2M: 52082688 kB' 'DirectMap1G: 13631488 kB'
00:04:27.933 16:12:56 -- setup/common.sh@32 -- # [xtrace scan of the snapshot elided: MemTotal .. HugePages_Rsvd, no match]
00:04:27.933 16:12:56 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:27.933 16:12:56 -- setup/common.sh@33 -- # echo 0
00:04:27.933 16:12:56 -- setup/common.sh@33 -- # return 0
00:04:27.933 16:12:56 -- setup/hugepages.sh@99 -- # surp=0
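At this point anon and surp are both 0 even though the test just requested NRHUGE=512: the INFO line earlier explains that setup.sh, running with CLEAR_HUGE=no, left the existing 1024-page allocation on node0 in place. A hypothetical reconstruction of that guard (the sysfs knob is the real per-node 2 MiB pool; the variable names and exact policy are assumptions inferred from the log message, not taken from the script):

    # Hypothetical guard matching the INFO message: with CLEAR_HUGE=no, an
    # existing per-node allocation that already covers the request is kept.
    NRHUGE=${NRHUGE:-512}
    CLEAR_HUGE=${CLEAR_HUGE:-no}
    hp=/sys/devices/system/node/node0/hugepages/hugepages-2048kB/nr_hugepages

    allocated=$(<"$hp")
    if [[ $CLEAR_HUGE != yes ]] && (( allocated >= NRHUGE )); then
        echo "INFO: Requested $NRHUGE hugepages but $allocated already allocated on node0"
    else
        echo "$NRHUGE" > "$hp"  # (re)size the node0 2 MiB hugepage pool
    fi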
00:04:27.933 16:12:56 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:27.933 16:12:56 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:27.933 16:12:56 -- setup/common.sh@18 -- # local node=
00:04:27.933 16:12:56 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:27.933 16:12:56 -- setup/common.sh@28 -- # mapfile -t mem
00:04:27.933 16:12:56 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:27.933 16:12:56 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42606000 kB' 'MemAvailable: 44461292 kB' 'Buffers: 4300 kB' 'Cached: 11310448 kB' 'SwapCached: 28 kB' 'Active: 10458460 kB' 'Inactive: 1486324 kB' 'Active(anon): 9975204 kB' 'Inactive(anon): 32212 kB' 'Active(file): 483256 kB' 'Inactive(file): 1454112 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 633292 kB' 'Mapped: 176136 kB' 'Shmem: 9377380 kB' 'KReclaimable: 565620 kB' 'Slab: 1210740 kB' 'SReclaimable: 565620 kB' 'SUnreclaim: 645120 kB' 'KernelStack: 21936 kB' 'PageTables: 8536 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 11399260 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216516 kB' 'VmallocChunk: 0 kB' 'Percpu: 104832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3177844 kB' 'DirectMap2M: 52082688 kB' 'DirectMap1G: 13631488 kB'
00:04:27.934 16:12:56 -- setup/common.sh@32 -- # [xtrace scan of the snapshot elided: MemTotal .. HugePages_Free, no match]
00:04:27.935 16:12:56 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:27.935 16:12:56 -- setup/common.sh@33 -- # echo 0
00:04:27.935 16:12:56 -- setup/common.sh@33 -- # return 0
00:04:27.935 16:12:56 -- setup/hugepages.sh@100 -- # resv=0
00:04:27.935 16:12:56 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:27.935 nr_hugepages=1024
00:04:27.935 16:12:56 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:27.935 resv_hugepages=0
00:04:27.935 16:12:56 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:27.935 surplus_hugepages=0
00:04:27.935 16:12:56 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:27.935 anon_hugepages=0
00:04:27.935 16:12:56 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:27.935 16:12:56 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
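The two arithmetic checks just traced are the core of verify_nr_hugepages: the kernel's reported pool must balance against the expected size plus surplus and reserved pages. Sketched with the get_meminfo helper from above (nr_hugepages is the expected pool size carried by the test, 1024 in this run):

    # Core identity behind hugepages.sh@107/@109:
    #   HugePages_Total == expected + surplus + reserved
    nr_hugepages=1024                      # expected pool size for this test
    surp=$(get_meminfo HugePages_Surp)     # 0 in this run
    resv=$(get_meminfo HugePages_Rsvd)     # 0 in this run
    total=$(get_meminfo HugePages_Total)   # 1024 in this run

    (( total == nr_hugepages + surp + resv )) || exit 1
    (( total == nr_hugepages )) || exit 1  # no surplus/reserved expected here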
'SUnreclaim: 645120 kB' 'KernelStack: 21936 kB' 'PageTables: 8536 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 11399272 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216516 kB' 'VmallocChunk: 0 kB' 'Percpu: 104832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3177844 kB' 'DirectMap2M: 52082688 kB' 'DirectMap1G: 13631488 kB' 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.935 16:12:56 -- 
setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.935 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.935 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.936 16:12:56 -- setup/common.sh@31 -- 
# IFS=': ' 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # [[ VmallocTotal == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # continue 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # 
IFS=': ' 00:04:27.936 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.936 16:12:56 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.936 16:12:56 -- setup/common.sh@33 -- # echo 1024 00:04:27.936 16:12:56 -- setup/common.sh@33 -- # return 0 00:04:27.936 16:12:56 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:27.936 16:12:56 -- setup/hugepages.sh@112 -- # get_nodes 00:04:27.936 16:12:56 -- setup/hugepages.sh@27 -- # local node 00:04:27.936 16:12:56 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:27.936 16:12:56 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:27.936 16:12:56 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:27.936 16:12:56 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:27.936 16:12:56 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:27.936 16:12:56 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:27.936 16:12:56 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:27.936 16:12:56 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:27.936 16:12:56 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:27.936 16:12:56 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:28.196 16:12:56 -- setup/common.sh@18 -- # local node=0 00:04:28.196 16:12:56 -- setup/common.sh@19 -- # local var val 00:04:28.196 16:12:56 -- setup/common.sh@20 -- # local mem_f mem 00:04:28.196 16:12:56 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:28.196 16:12:56 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:28.196 16:12:56 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:28.196 16:12:56 -- setup/common.sh@28 -- # mapfile -t mem 00:04:28.196 16:12:56 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:28.196 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.196 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.196 16:12:56 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 23892028 kB' 'MemUsed: 8700056 kB' 'SwapCached: 16 kB' 'Active: 4523180 kB' 'Inactive: 338724 kB' 'Active(anon): 4304960 kB' 'Inactive(anon): 16 kB' 'Active(file): 218220 kB' 'Inactive(file): 338708 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4477740 kB' 'Mapped: 109952 kB' 'AnonPages: 387276 kB' 'Shmem: 3920796 kB' 'KernelStack: 12824 kB' 'PageTables: 4980 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 349476 kB' 'Slab: 640984 kB' 'SReclaimable: 349476 kB' 'SUnreclaim: 291508 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:28.196 16:12:56 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.196 16:12:56 -- setup/common.sh@32 -- # continue 00:04:28.196 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.196 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.196 16:12:56 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.196 16:12:56 -- setup/common.sh@32 -- # continue 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # [[ 
MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # continue 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # continue 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # continue 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # continue 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # continue 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # continue 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # continue 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # continue 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # continue 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # continue 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # continue 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # continue 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # continue 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.197 16:12:56 -- 
setup/common.sh@31 -- # read -r var val _ 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # continue 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # continue 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # continue 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # continue 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # continue 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # continue 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # continue 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # continue 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # continue 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # continue 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # continue 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # continue 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # 
continue 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # continue 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # continue 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # continue 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # continue 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # continue 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # continue 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # continue 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # continue 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:28.197 16:12:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:28.197 16:12:56 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.197 16:12:56 -- setup/common.sh@33 -- # echo 0 00:04:28.197 16:12:56 -- setup/common.sh@33 -- # return 0 00:04:28.197 16:12:56 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:28.197 16:12:56 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:28.197 16:12:56 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:28.197 16:12:56 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:28.197 16:12:56 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:28.197 node0=1024 expecting 1024 00:04:28.197 16:12:56 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:28.197 00:04:28.197 real 0m6.828s 00:04:28.197 user 0m2.653s 00:04:28.197 sys 0m4.310s 00:04:28.197 16:12:56 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:28.197 16:12:56 -- common/autotest_common.sh@10 -- # set +x 00:04:28.197 ************************************ 00:04:28.197 END TEST no_shrink_alloc 00:04:28.197 ************************************ 00:04:28.197 16:12:56 -- 
setup/hugepages.sh@217 -- # clear_hp 00:04:28.197 16:12:56 -- setup/hugepages.sh@37 -- # local node hp 00:04:28.197 16:12:56 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:28.197 16:12:56 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:28.197 16:12:56 -- setup/hugepages.sh@41 -- # echo 0 00:04:28.197 16:12:56 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:28.197 16:12:56 -- setup/hugepages.sh@41 -- # echo 0 00:04:28.197 16:12:56 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:28.197 16:12:56 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:28.197 16:12:56 -- setup/hugepages.sh@41 -- # echo 0 00:04:28.197 16:12:56 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:28.197 16:12:56 -- setup/hugepages.sh@41 -- # echo 0 00:04:28.197 16:12:56 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:28.197 16:12:56 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:04:28.197 00:04:28.197 real 0m26.200s 00:04:28.197 user 0m9.177s 00:04:28.198 sys 0m15.821s 00:04:28.198 16:12:56 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:28.198 16:12:56 -- common/autotest_common.sh@10 -- # set +x 00:04:28.198 ************************************ 00:04:28.198 END TEST hugepages 00:04:28.198 ************************************ 00:04:28.198 16:12:56 -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:04:28.198 16:12:56 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:28.198 16:12:56 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:28.198 16:12:56 -- common/autotest_common.sh@10 -- # set +x 00:04:28.198 ************************************ 00:04:28.198 START TEST driver 00:04:28.198 ************************************ 00:04:28.198 16:12:56 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:04:28.198 * Looking for test storage... 
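The long field-by-field trace above is setup/common.sh's get_meminfo walking /proc/meminfo one key at a time: every non-matching key costs one [[ ]] test plus one continue, which is exactly the wall of entries logged. As a minimal sketch of the same pattern (illustrative only, not the SPDK helper verbatim), the lookup reduces to:

shopt -s extglob
# Sketch: look up one field in /proc/meminfo, or in a per-node meminfo when a
# node index is given. Per-node files prefix every line with "Node N ".
get_meminfo_sketch() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local line var val _
    while read -r line; do
        line=${line#Node +([0-9]) }              # no-op for /proc/meminfo
        IFS=': ' read -r var val _ <<< "$line"   # split "Key:   value kB"
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done < "$mem_f"
    return 1
}
# e.g. get_meminfo_sketch HugePages_Rsvd   -> 0 in the run above
#      get_meminfo_sketch HugePages_Surp 0 -> the node0 surplus count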
00:04:28.198 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:28.198 16:12:56 -- setup/driver.sh@68 -- # setup reset 00:04:28.198 16:12:56 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:28.198 16:12:56 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:33.476 16:13:01 -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:04:33.476 16:13:01 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:33.476 16:13:01 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:33.476 16:13:01 -- common/autotest_common.sh@10 -- # set +x 00:04:33.476 ************************************ 00:04:33.476 START TEST guess_driver 00:04:33.476 ************************************ 00:04:33.476 16:13:01 -- common/autotest_common.sh@1104 -- # guess_driver 00:04:33.476 16:13:01 -- setup/driver.sh@46 -- # local driver setup_driver marker 00:04:33.477 16:13:01 -- setup/driver.sh@47 -- # local fail=0 00:04:33.477 16:13:01 -- setup/driver.sh@49 -- # pick_driver 00:04:33.477 16:13:01 -- setup/driver.sh@36 -- # vfio 00:04:33.477 16:13:01 -- setup/driver.sh@21 -- # local iommu_grups 00:04:33.477 16:13:01 -- setup/driver.sh@22 -- # local unsafe_vfio 00:04:33.477 16:13:01 -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:04:33.477 16:13:01 -- setup/driver.sh@25 -- # unsafe_vfio=N 00:04:33.477 16:13:01 -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:04:33.477 16:13:01 -- setup/driver.sh@29 -- # (( 176 > 0 )) 00:04:33.477 16:13:01 -- setup/driver.sh@30 -- # is_driver vfio_pci 00:04:33.477 16:13:01 -- setup/driver.sh@14 -- # mod vfio_pci 00:04:33.477 16:13:01 -- setup/driver.sh@12 -- # dep vfio_pci 00:04:33.477 16:13:01 -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:04:33.477 16:13:01 -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:04:33.477 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:33.477 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:33.477 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:33.477 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:33.477 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:04:33.477 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:04:33.477 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:04:33.477 16:13:01 -- setup/driver.sh@30 -- # return 0 00:04:33.477 16:13:01 -- setup/driver.sh@37 -- # echo vfio-pci 00:04:33.477 16:13:01 -- setup/driver.sh@49 -- # driver=vfio-pci 00:04:33.477 16:13:01 -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:04:33.477 16:13:01 -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:04:33.477 Looking for driver=vfio-pci 00:04:33.477 16:13:01 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:33.477 16:13:01 -- setup/driver.sh@45 -- # setup output config 00:04:33.477 16:13:01 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:33.477 16:13:01 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:36.010 16:13:04 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:36.010 16:13:04 -- setup/driver.sh@61 -- # [[ 
vfio-pci == vfio-pci ]] 00:04:36.010 16:13:04 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:36.010 16:13:04 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:36.010 16:13:04 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:36.010 16:13:04 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:36.268 16:13:04 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:36.268 16:13:04 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:36.268 16:13:04 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:36.268 16:13:04 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:36.268 16:13:04 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:36.268 16:13:04 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:36.268 16:13:04 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:36.268 16:13:04 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:36.268 16:13:04 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:36.268 16:13:04 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:36.268 16:13:04 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:36.268 16:13:04 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:36.268 16:13:04 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:36.268 16:13:04 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:36.268 16:13:04 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:36.268 16:13:04 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:36.268 16:13:04 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:36.268 16:13:04 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:36.268 16:13:04 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:36.268 16:13:04 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:36.268 16:13:04 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:36.268 16:13:04 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:36.268 16:13:04 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:36.268 16:13:04 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:36.268 16:13:04 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:36.268 16:13:04 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:36.268 16:13:04 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:36.268 16:13:04 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:36.268 16:13:04 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:36.268 16:13:04 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:36.268 16:13:04 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:36.268 16:13:04 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:36.268 16:13:04 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:36.268 16:13:04 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:36.268 16:13:04 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:36.268 16:13:04 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:36.268 16:13:04 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:36.268 16:13:04 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:36.268 16:13:04 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:36.268 16:13:04 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:36.268 16:13:04 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:36.268 16:13:04 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:38.170 16:13:06 -- 
setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:38.170 16:13:06 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:38.170 16:13:06 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:38.170 16:13:06 -- setup/driver.sh@64 -- # (( fail == 0 )) 00:04:38.170 16:13:06 -- setup/driver.sh@65 -- # setup reset 00:04:38.170 16:13:06 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:38.170 16:13:06 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:43.456 00:04:43.456 real 0m10.121s 00:04:43.456 user 0m2.728s 00:04:43.456 sys 0m5.130s 00:04:43.456 16:13:11 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:43.456 16:13:11 -- common/autotest_common.sh@10 -- # set +x 00:04:43.456 ************************************ 00:04:43.456 END TEST guess_driver 00:04:43.456 ************************************ 00:04:43.456 00:04:43.456 real 0m14.657s 00:04:43.456 user 0m3.864s 00:04:43.456 sys 0m7.714s 00:04:43.456 16:13:11 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:43.456 16:13:11 -- common/autotest_common.sh@10 -- # set +x 00:04:43.456 ************************************ 00:04:43.456 END TEST driver 00:04:43.456 ************************************ 00:04:43.456 16:13:11 -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:04:43.456 16:13:11 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:43.456 16:13:11 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:43.456 16:13:11 -- common/autotest_common.sh@10 -- # set +x 00:04:43.456 ************************************ 00:04:43.456 START TEST devices 00:04:43.456 ************************************ 00:04:43.456 16:13:11 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:04:43.456 * Looking for test storage... 
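TEST guess_driver above settled on vfio-pci because an IOMMU was present (176 groups) and modprobe could resolve the module. A condensed sketch of that decision follows; it is simplified from the trace rather than copied from setup/driver.sh:

pick_driver_sketch() {
    local groups=(/sys/kernel/iommu_groups/*)    # 176 entries in the run above
    local unsafe=N
    [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] \
        && unsafe=$(</sys/module/vfio/parameters/enable_unsafe_noiommu_mode)
    if ((${#groups[@]} > 0)) || [[ $unsafe == [Yy] ]]; then
        # "modprobe --show-depends" prints insmod lines ending in .ko(.xz)
        # when the module and all of its dependencies are loadable.
        if modprobe --show-depends vfio_pci 2>/dev/null | grep -q '\.ko'; then
            echo vfio-pci
            return 0
        fi
    fi
    echo 'No valid driver found'
    return 1
}

The fallback string matches the sentinel the test compares against at driver.sh@51.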
00:04:43.456 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:43.456 16:13:11 -- setup/devices.sh@190 -- # trap cleanup EXIT 00:04:43.456 16:13:11 -- setup/devices.sh@192 -- # setup reset 00:04:43.456 16:13:11 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:43.456 16:13:11 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:46.738 16:13:15 -- setup/devices.sh@194 -- # get_zoned_devs 00:04:46.738 16:13:15 -- common/autotest_common.sh@1654 -- # zoned_devs=() 00:04:46.738 16:13:15 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs 00:04:46.738 16:13:15 -- common/autotest_common.sh@1655 -- # local nvme bdf 00:04:46.738 16:13:15 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme* 00:04:46.738 16:13:15 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1 00:04:46.738 16:13:15 -- common/autotest_common.sh@1647 -- # local device=nvme0n1 00:04:46.738 16:13:15 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:46.738 16:13:15 -- common/autotest_common.sh@1650 -- # [[ none != none ]] 00:04:46.738 16:13:15 -- setup/devices.sh@196 -- # blocks=() 00:04:46.738 16:13:15 -- setup/devices.sh@196 -- # declare -a blocks 00:04:46.738 16:13:15 -- setup/devices.sh@197 -- # blocks_to_pci=() 00:04:46.738 16:13:15 -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:04:46.738 16:13:15 -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:04:46.738 16:13:15 -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:04:46.738 16:13:15 -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:04:46.738 16:13:15 -- setup/devices.sh@201 -- # ctrl=nvme0 00:04:46.738 16:13:15 -- setup/devices.sh@202 -- # pci=0000:d8:00.0 00:04:46.738 16:13:15 -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:04:46.738 16:13:15 -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:04:46.738 16:13:15 -- scripts/common.sh@380 -- # local block=nvme0n1 pt 00:04:46.738 16:13:15 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:04:46.738 No valid GPT data, bailing 00:04:46.738 16:13:15 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:46.738 16:13:15 -- scripts/common.sh@393 -- # pt= 00:04:46.738 16:13:15 -- scripts/common.sh@394 -- # return 1 00:04:46.738 16:13:15 -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:04:46.738 16:13:15 -- setup/common.sh@76 -- # local dev=nvme0n1 00:04:46.738 16:13:15 -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:04:46.738 16:13:15 -- setup/common.sh@80 -- # echo 1600321314816 00:04:46.738 16:13:15 -- setup/devices.sh@204 -- # (( 1600321314816 >= min_disk_size )) 00:04:46.738 16:13:15 -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:04:46.738 16:13:15 -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:d8:00.0 00:04:46.738 16:13:15 -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:04:46.738 16:13:15 -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:04:46.738 16:13:15 -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:04:46.738 16:13:15 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:46.738 16:13:15 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:46.738 16:13:15 -- common/autotest_common.sh@10 -- # set +x 00:04:46.738 ************************************ 00:04:46.738 START TEST nvme_mount 00:04:46.738 ************************************ 00:04:46.738 
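Before nvme_mount starts writing, the devices suite above filtered candidate disks: zoned namespaces are excluded, a disk carrying a partition-table signature counts as in use (spdk-gpt.py reported "No valid GPT data, bailing", i.e. the disk is free), and anything of at least 3221225472 bytes qualifies. A rough sketch of that scan (illustrative; the real devices.sh also consults spdk-gpt.py):

shopt -s extglob
min_disk_size=$((3 * 1024 * 1024 * 1024))      # 3221225472 bytes, as traced
for dev in /sys/block/nvme!(*c*); do           # skip NVMe multipath ctrl nodes
    name=${dev##*/}
    # queue/zoned reads "none" for conventional (non-zoned) namespaces
    [[ $(<"$dev/queue/zoned") != none ]] && continue
    # an existing partition-table signature means the disk is in use
    [[ -n $(blkid -s PTTYPE -o value "/dev/$name") ]] && continue
    size=$(( $(<"$dev/size") * 512 ))          # size file counts 512 B sectors
    ((size >= min_disk_size)) && echo "candidate: /dev/$name ($size bytes)"
done

On this machine that yields the 1600321314816-byte nvme0n1 behind PCI address 0000:d8:00.0.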
16:13:15 -- common/autotest_common.sh@1104 -- # nvme_mount 00:04:46.738 16:13:15 -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:04:46.738 16:13:15 -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:04:46.738 16:13:15 -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:46.738 16:13:15 -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:46.738 16:13:15 -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:04:46.738 16:13:15 -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:46.738 16:13:15 -- setup/common.sh@40 -- # local part_no=1 00:04:46.738 16:13:15 -- setup/common.sh@41 -- # local size=1073741824 00:04:46.738 16:13:15 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:46.738 16:13:15 -- setup/common.sh@44 -- # parts=() 00:04:46.738 16:13:15 -- setup/common.sh@44 -- # local parts 00:04:46.738 16:13:15 -- setup/common.sh@46 -- # (( part = 1 )) 00:04:46.738 16:13:15 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:46.738 16:13:15 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:46.738 16:13:15 -- setup/common.sh@46 -- # (( part++ )) 00:04:46.738 16:13:15 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:46.738 16:13:15 -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:46.738 16:13:15 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:46.738 16:13:15 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:04:47.673 Creating new GPT entries in memory. 00:04:47.673 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:47.673 other utilities. 00:04:47.673 16:13:16 -- setup/common.sh@57 -- # (( part = 1 )) 00:04:47.673 16:13:16 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:47.673 16:13:16 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:47.673 16:13:16 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:47.673 16:13:16 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:49.049 Creating new GPT entries in memory. 00:04:49.049 The operation has completed successfully. 
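The --new=1:2048:2099199 argument just traced is plain sector arithmetic: 1 GiB is 1073741824 / 512 = 2097152 sectors, the first partition starts at LBA 2048, and its last LBA is 2048 + 2097152 - 1 = 2099199. A standalone sketch of the loop (same arithmetic as common.sh@57-60):

disk=/dev/nvme0n1
size=$((1073741824 / 512))     # 1 GiB in 512 B sectors = 2097152
part_start=0 part_end=0
sgdisk "$disk" --zap-all       # drop any existing GPT/MBR structures
for part in 1; do
    ((part_start = part_start == 0 ? 2048 : part_end + 1))
    ((part_end = part_start + size - 1))
    # flock serializes partitioners racing for the same disk node
    flock "$disk" sgdisk "$disk" --new=$part:$part_start:$part_end
done

The sync_dev_uevents.sh helper seen above waits for the block/partition uevent, so later steps do not race udev's creation of /dev/nvme0n1p1.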
00:04:49.049 16:13:17 -- setup/common.sh@57 -- # (( part++ )) 00:04:49.049 16:13:17 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:49.049 16:13:17 -- setup/common.sh@62 -- # wait 2230159 00:04:49.049 16:13:17 -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:49.049 16:13:17 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:04:49.049 16:13:17 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:49.049 16:13:17 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:04:49.049 16:13:17 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:04:49.049 16:13:17 -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:49.049 16:13:17 -- setup/devices.sh@105 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:49.049 16:13:17 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:49.049 16:13:17 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:04:49.049 16:13:17 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:49.049 16:13:17 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:49.049 16:13:17 -- setup/devices.sh@53 -- # local found=0 00:04:49.049 16:13:17 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:49.049 16:13:17 -- setup/devices.sh@56 -- # : 00:04:49.049 16:13:17 -- setup/devices.sh@59 -- # local pci status 00:04:49.049 16:13:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.049 16:13:17 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:49.049 16:13:17 -- setup/devices.sh@47 -- # setup output config 00:04:49.049 16:13:17 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:49.049 16:13:17 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:52.434 16:13:20 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.434 16:13:20 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:04:52.434 16:13:20 -- setup/devices.sh@63 -- # found=1 00:04:52.434 16:13:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.434 16:13:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.434 16:13:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.434 16:13:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.434 16:13:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.434 16:13:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.434 16:13:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.434 16:13:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.434 16:13:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.434 16:13:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 
]] 00:04:52.434 16:13:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.434 16:13:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.434 16:13:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.434 16:13:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.434 16:13:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.434 16:13:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.434 16:13:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.434 16:13:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.434 16:13:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.434 16:13:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.434 16:13:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.434 16:13:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.434 16:13:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.434 16:13:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.434 16:13:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.434 16:13:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.434 16:13:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.434 16:13:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.434 16:13:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.434 16:13:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.434 16:13:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.434 16:13:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.434 16:13:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.434 16:13:20 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:52.434 16:13:20 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:52.434 16:13:20 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:52.434 16:13:20 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:52.434 16:13:20 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:52.434 16:13:20 -- setup/devices.sh@110 -- # cleanup_nvme 00:04:52.434 16:13:20 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:52.434 16:13:20 -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:52.434 16:13:20 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:52.434 16:13:20 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:52.434 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:52.434 16:13:20 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:52.434 16:13:20 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:52.434 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:52.434 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:04:52.434 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe 
(PMBR): 55 aa 00:04:52.434 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:52.434 16:13:21 -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:04:52.434 16:13:21 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:04:52.434 16:13:21 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:52.434 16:13:21 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:04:52.434 16:13:21 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:04:52.434 16:13:21 -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:52.434 16:13:21 -- setup/devices.sh@116 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:52.434 16:13:21 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:52.434 16:13:21 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:04:52.434 16:13:21 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:52.434 16:13:21 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:52.434 16:13:21 -- setup/devices.sh@53 -- # local found=0 00:04:52.434 16:13:21 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:52.434 16:13:21 -- setup/devices.sh@56 -- # : 00:04:52.434 16:13:21 -- setup/devices.sh@59 -- # local pci status 00:04:52.434 16:13:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.434 16:13:21 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:52.434 16:13:21 -- setup/devices.sh@47 -- # setup output config 00:04:52.434 16:13:21 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:52.434 16:13:21 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:55.722 16:13:24 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:55.722 16:13:24 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:04:55.722 16:13:24 -- setup/devices.sh@63 -- # found=1 00:04:55.722 16:13:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:55.722 16:13:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:55.722 16:13:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:55.722 16:13:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:55.722 16:13:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:55.722 16:13:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:55.722 16:13:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:55.722 16:13:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:55.722 16:13:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:55.722 16:13:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:55.722 16:13:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:55.722 16:13:24 -- 
setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:55.722 16:13:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:55.722 16:13:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:55.722 16:13:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:55.722 16:13:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:55.722 16:13:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:55.722 16:13:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:55.722 16:13:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:55.722 16:13:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:55.722 16:13:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:55.722 16:13:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:55.722 16:13:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:55.722 16:13:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:55.722 16:13:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:55.722 16:13:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:55.722 16:13:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:55.722 16:13:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:55.722 16:13:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:55.722 16:13:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:55.722 16:13:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:55.722 16:13:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:55.722 16:13:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:55.722 16:13:24 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:55.722 16:13:24 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:55.722 16:13:24 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:55.722 16:13:24 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:55.722 16:13:24 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:55.722 16:13:24 -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:55.722 16:13:24 -- setup/devices.sh@125 -- # verify 0000:d8:00.0 data@nvme0n1 '' '' 00:04:55.722 16:13:24 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:55.722 16:13:24 -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:04:55.722 16:13:24 -- setup/devices.sh@50 -- # local mount_point= 00:04:55.722 16:13:24 -- setup/devices.sh@51 -- # local test_file= 00:04:55.722 16:13:24 -- setup/devices.sh@53 -- # local found=0 00:04:55.722 16:13:24 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:55.722 16:13:24 -- setup/devices.sh@59 -- # local pci status 00:04:55.722 16:13:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:55.722 16:13:24 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:55.722 16:13:24 -- setup/devices.sh@47 -- # setup output config 00:04:55.722 16:13:24 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:55.722 16:13:24 -- setup/common.sh@10 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:59.005 16:13:27 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.005 16:13:27 -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:04:59.005 16:13:27 -- setup/devices.sh@63 -- # found=1 00:04:59.005 16:13:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.005 16:13:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.005 16:13:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.005 16:13:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.005 16:13:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.005 16:13:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.005 16:13:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.005 16:13:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.005 16:13:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.005 16:13:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.005 16:13:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.005 16:13:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.005 16:13:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.005 16:13:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.005 16:13:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.005 16:13:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.005 16:13:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.005 16:13:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.005 16:13:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.005 16:13:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.005 16:13:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.005 16:13:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.005 16:13:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.005 16:13:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.005 16:13:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.005 16:13:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.005 16:13:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.005 16:13:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.005 16:13:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.005 16:13:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.005 16:13:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.005 16:13:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.005 16:13:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.005 16:13:27 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:59.005 16:13:27 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:59.005 16:13:27 -- setup/devices.sh@68 -- # return 0 00:04:59.005 16:13:27 -- setup/devices.sh@128 -- # cleanup_nvme 00:04:59.005 16:13:27 -- setup/devices.sh@20 -- # mountpoint -q 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:59.005 16:13:27 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:59.005 16:13:27 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:59.005 16:13:27 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:59.005 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:59.005 00:04:59.005 real 0m12.205s 00:04:59.005 user 0m3.454s 00:04:59.005 sys 0m6.600s 00:04:59.005 16:13:27 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:59.005 16:13:27 -- common/autotest_common.sh@10 -- # set +x 00:04:59.005 ************************************ 00:04:59.005 END TEST nvme_mount 00:04:59.005 ************************************ 00:04:59.005 16:13:27 -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:04:59.005 16:13:27 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:59.005 16:13:27 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:59.005 16:13:27 -- common/autotest_common.sh@10 -- # set +x 00:04:59.005 ************************************ 00:04:59.005 START TEST dm_mount 00:04:59.005 ************************************ 00:04:59.005 16:13:27 -- common/autotest_common.sh@1104 -- # dm_mount 00:04:59.005 16:13:27 -- setup/devices.sh@144 -- # pv=nvme0n1 00:04:59.005 16:13:27 -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:04:59.005 16:13:27 -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:04:59.005 16:13:27 -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:04:59.005 16:13:27 -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:59.005 16:13:27 -- setup/common.sh@40 -- # local part_no=2 00:04:59.005 16:13:27 -- setup/common.sh@41 -- # local size=1073741824 00:04:59.005 16:13:27 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:59.005 16:13:27 -- setup/common.sh@44 -- # parts=() 00:04:59.005 16:13:27 -- setup/common.sh@44 -- # local parts 00:04:59.005 16:13:27 -- setup/common.sh@46 -- # (( part = 1 )) 00:04:59.005 16:13:27 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:59.005 16:13:27 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:59.005 16:13:27 -- setup/common.sh@46 -- # (( part++ )) 00:04:59.005 16:13:27 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:59.005 16:13:27 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:59.005 16:13:27 -- setup/common.sh@46 -- # (( part++ )) 00:04:59.005 16:13:27 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:59.005 16:13:27 -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:59.005 16:13:27 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:59.005 16:13:27 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:04:59.939 Creating new GPT entries in memory. 00:04:59.939 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:59.939 other utilities. 00:04:59.939 16:13:28 -- setup/common.sh@57 -- # (( part = 1 )) 00:04:59.939 16:13:28 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:59.939 16:13:28 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:59.939 16:13:28 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:59.939 16:13:28 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:01.310 Creating new GPT entries in memory. 00:05:01.310 The operation has completed successfully. 
00:05:01.311 16:13:29 -- setup/common.sh@57 -- # (( part++ )) 00:05:01.311 16:13:29 -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:01.311 16:13:29 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:01.311 16:13:29 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:01.311 16:13:29 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:05:02.244 The operation has completed successfully. 00:05:02.244 16:13:30 -- setup/common.sh@57 -- # (( part++ )) 00:05:02.244 16:13:30 -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:02.244 16:13:30 -- setup/common.sh@62 -- # wait 2234665 00:05:02.244 16:13:30 -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:05:02.244 16:13:30 -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:02.244 16:13:30 -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:02.244 16:13:30 -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:05:02.244 16:13:30 -- setup/devices.sh@160 -- # for t in {1..5} 00:05:02.244 16:13:30 -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:02.244 16:13:30 -- setup/devices.sh@161 -- # break 00:05:02.244 16:13:30 -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:02.244 16:13:30 -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:05:02.244 16:13:30 -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:05:02.244 16:13:30 -- setup/devices.sh@166 -- # dm=dm-0 00:05:02.244 16:13:30 -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:05:02.245 16:13:30 -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:05:02.245 16:13:30 -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:02.245 16:13:30 -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:05:02.245 16:13:30 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:02.245 16:13:30 -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:02.245 16:13:30 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:05:02.245 16:13:30 -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:02.245 16:13:30 -- setup/devices.sh@174 -- # verify 0000:d8:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:02.245 16:13:30 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:02.245 16:13:30 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:05:02.245 16:13:30 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:02.245 16:13:30 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:02.245 16:13:30 -- setup/devices.sh@53 -- # local found=0 00:05:02.245 16:13:30 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:02.245 16:13:30 -- setup/devices.sh@56 -- # : 
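The readlink/holders sequence just traced (devices.sh@165-169) resolves the friendly mapper name to its dm node and confirms both partitions back it. A stand-alone sketch of the same probe, assuming the nvme_dm_test device-mapper target created above still exists:

    #!/usr/bin/env bash
    # Resolve /dev/mapper/<name> to its dm-N node and list the partitions
    # holding it (assumption: nvme_dm_test exists over nvme0n1p1/p2, as above).
    name=nvme_dm_test
    dm=$(basename "$(readlink -f "/dev/mapper/$name")")   # e.g. dm-0
    echo "$name resolves to $dm"
    for part in nvme0n1p1 nvme0n1p2; do
      if [[ -e /sys/class/block/$part/holders/$dm ]]; then
        echo "$part is held by $dm"
      fi
    done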
00:05:02.245 16:13:30 -- setup/devices.sh@59 -- # local pci status 00:05:02.245 16:13:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:02.245 16:13:30 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:02.245 16:13:30 -- setup/devices.sh@47 -- # setup output config 00:05:02.245 16:13:30 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:02.245 16:13:30 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:05.523 16:13:33 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.524 16:13:33 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:05:05.524 16:13:33 -- setup/devices.sh@63 -- # found=1 00:05:05.524 16:13:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.524 16:13:33 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.524 16:13:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.524 16:13:33 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.524 16:13:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.524 16:13:33 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.524 16:13:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.524 16:13:33 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.524 16:13:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.524 16:13:33 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.524 16:13:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.524 16:13:33 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.524 16:13:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.524 16:13:33 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.524 16:13:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.524 16:13:33 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.524 16:13:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.524 16:13:33 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.524 16:13:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.524 16:13:33 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.524 16:13:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.524 16:13:34 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.524 16:13:34 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.524 16:13:34 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.524 16:13:34 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.524 16:13:34 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.524 16:13:34 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.524 16:13:34 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.524 16:13:34 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.524 16:13:34 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.524 16:13:34 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.524 16:13:34 -- setup/devices.sh@62 -- # 
[[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.524 16:13:34 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.524 16:13:34 -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:05.524 16:13:34 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:05:05.524 16:13:34 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:05.524 16:13:34 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:05.524 16:13:34 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:05.524 16:13:34 -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:05.524 16:13:34 -- setup/devices.sh@184 -- # verify 0000:d8:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:05:05.524 16:13:34 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:05.524 16:13:34 -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:05:05.524 16:13:34 -- setup/devices.sh@50 -- # local mount_point= 00:05:05.524 16:13:34 -- setup/devices.sh@51 -- # local test_file= 00:05:05.524 16:13:34 -- setup/devices.sh@53 -- # local found=0 00:05:05.524 16:13:34 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:05.524 16:13:34 -- setup/devices.sh@59 -- # local pci status 00:05:05.524 16:13:34 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.524 16:13:34 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:05.524 16:13:34 -- setup/devices.sh@47 -- # setup output config 00:05:05.524 16:13:34 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:05.524 16:13:34 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:08.800 16:13:37 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.800 16:13:37 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:05:08.800 16:13:37 -- setup/devices.sh@63 -- # found=1 00:05:08.800 16:13:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.800 16:13:37 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.800 16:13:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.800 16:13:37 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.800 16:13:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.800 16:13:37 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.800 16:13:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.800 16:13:37 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.800 16:13:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.800 16:13:37 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.800 16:13:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.800 16:13:37 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.800 16:13:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.800 16:13:37 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.800 16:13:37 -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:05:08.800 16:13:37 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.800 16:13:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.800 16:13:37 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.800 16:13:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.800 16:13:37 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.800 16:13:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.800 16:13:37 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.800 16:13:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.800 16:13:37 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.800 16:13:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.800 16:13:37 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.800 16:13:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.800 16:13:37 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.800 16:13:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.800 16:13:37 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.800 16:13:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.800 16:13:37 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.800 16:13:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.800 16:13:37 -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:08.800 16:13:37 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:08.800 16:13:37 -- setup/devices.sh@68 -- # return 0 00:05:08.800 16:13:37 -- setup/devices.sh@187 -- # cleanup_dm 00:05:08.800 16:13:37 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:09.059 16:13:37 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:09.059 16:13:37 -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:05:09.059 16:13:37 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:09.059 16:13:37 -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:05:09.059 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:09.059 16:13:37 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:09.059 16:13:37 -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:05:09.059 00:05:09.059 real 0m9.996s 00:05:09.059 user 0m2.430s 00:05:09.059 sys 0m4.655s 00:05:09.059 16:13:37 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:09.059 16:13:37 -- common/autotest_common.sh@10 -- # set +x 00:05:09.059 ************************************ 00:05:09.059 END TEST dm_mount 00:05:09.059 ************************************ 00:05:09.059 16:13:37 -- setup/devices.sh@1 -- # cleanup 00:05:09.059 16:13:37 -- setup/devices.sh@11 -- # cleanup_nvme 00:05:09.059 16:13:37 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:09.059 16:13:37 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:09.059 16:13:37 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:09.059 16:13:37 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:09.059 16:13:37 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:09.317 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:09.317 /dev/nvme0n1: 8 
bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:05:09.317 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:09.317 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:09.317 16:13:38 -- setup/devices.sh@12 -- # cleanup_dm 00:05:09.317 16:13:38 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:09.317 16:13:38 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:09.317 16:13:38 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:09.317 16:13:38 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:09.317 16:13:38 -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:05:09.317 16:13:38 -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:05:09.317 00:05:09.317 real 0m26.483s 00:05:09.317 user 0m7.327s 00:05:09.317 sys 0m14.017s 00:05:09.317 16:13:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:09.317 16:13:38 -- common/autotest_common.sh@10 -- # set +x 00:05:09.317 ************************************ 00:05:09.317 END TEST devices 00:05:09.317 ************************************ 00:05:09.317 00:05:09.317 real 1m31.543s 00:05:09.317 user 0m28.054s 00:05:09.317 sys 0m52.313s 00:05:09.317 16:13:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:09.317 16:13:38 -- common/autotest_common.sh@10 -- # set +x 00:05:09.317 ************************************ 00:05:09.317 END TEST setup.sh 00:05:09.317 ************************************ 00:05:09.317 16:13:38 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:05:12.598 Hugepages 00:05:12.598 node hugesize free / total 00:05:12.598 node0 1048576kB 0 / 0 00:05:12.598 node0 2048kB 2048 / 2048 00:05:12.598 node1 1048576kB 0 / 0 00:05:12.598 node1 2048kB 0 / 0 00:05:12.598 00:05:12.598 Type BDF Vendor Device NUMA Driver Device Block devices 00:05:12.598 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:05:12.598 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:05:12.598 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:05:12.598 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:05:12.598 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:05:12.598 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:05:12.598 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:05:12.598 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:05:12.598 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:05:12.598 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:05:12.598 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:05:12.598 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:05:12.598 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:05:12.598 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:05:12.598 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:05:12.598 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:05:12.598 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:05:12.598 16:13:41 -- spdk/autotest.sh@141 -- # uname -s 00:05:12.598 16:13:41 -- spdk/autotest.sh@141 -- # [[ Linux == Linux ]] 00:05:12.598 16:13:41 -- spdk/autotest.sh@143 -- # nvme_namespace_revert 00:05:12.598 16:13:41 -- common/autotest_common.sh@1516 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:15.874 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:15.874 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:15.874 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:15.874 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:15.874 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 
00:05:15.874 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:15.874 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:15.874 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:15.874 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:15.874 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:15.874 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:15.874 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:15.874 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:15.874 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:15.874 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:16.132 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:17.503 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:17.503 16:13:46 -- common/autotest_common.sh@1517 -- # sleep 1 00:05:18.886 16:13:47 -- common/autotest_common.sh@1518 -- # bdfs=() 00:05:18.886 16:13:47 -- common/autotest_common.sh@1518 -- # local bdfs 00:05:18.886 16:13:47 -- common/autotest_common.sh@1519 -- # bdfs=($(get_nvme_bdfs)) 00:05:18.886 16:13:47 -- common/autotest_common.sh@1519 -- # get_nvme_bdfs 00:05:18.886 16:13:47 -- common/autotest_common.sh@1498 -- # bdfs=() 00:05:18.886 16:13:47 -- common/autotest_common.sh@1498 -- # local bdfs 00:05:18.886 16:13:47 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:18.886 16:13:47 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:18.886 16:13:47 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:05:18.886 16:13:47 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:05:18.886 16:13:47 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:d8:00.0 00:05:18.886 16:13:47 -- common/autotest_common.sh@1521 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:22.170 Waiting for block devices as requested 00:05:22.170 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:22.170 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:22.170 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:22.170 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:22.170 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:22.170 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:22.170 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:22.170 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:22.428 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:22.428 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:22.428 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:22.687 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:22.687 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:22.687 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:22.945 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:22.945 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:22.945 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:05:23.204 16:13:51 -- common/autotest_common.sh@1523 -- # for bdf in "${bdfs[@]}" 00:05:23.204 16:13:51 -- common/autotest_common.sh@1524 -- # get_nvme_ctrlr_from_bdf 0000:d8:00.0 00:05:23.204 16:13:51 -- common/autotest_common.sh@1487 -- # readlink -f /sys/class/nvme/nvme0 00:05:23.204 16:13:51 -- common/autotest_common.sh@1487 -- # grep 0000:d8:00.0/nvme/nvme 00:05:23.204 16:13:51 -- common/autotest_common.sh@1487 -- # bdf_sysfs_path=/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:05:23.204 16:13:51 -- common/autotest_common.sh@1488 -- # [[ -z 
/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 ]] 00:05:23.204 16:13:51 -- common/autotest_common.sh@1492 -- # basename /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:05:23.204 16:13:51 -- common/autotest_common.sh@1492 -- # printf '%s\n' nvme0 00:05:23.204 16:13:51 -- common/autotest_common.sh@1524 -- # nvme_ctrlr=/dev/nvme0 00:05:23.204 16:13:51 -- common/autotest_common.sh@1525 -- # [[ -z /dev/nvme0 ]] 00:05:23.204 16:13:51 -- common/autotest_common.sh@1530 -- # grep oacs 00:05:23.204 16:13:51 -- common/autotest_common.sh@1530 -- # nvme id-ctrl /dev/nvme0 00:05:23.204 16:13:51 -- common/autotest_common.sh@1530 -- # cut -d: -f2 00:05:23.204 16:13:51 -- common/autotest_common.sh@1530 -- # oacs=' 0xe' 00:05:23.204 16:13:51 -- common/autotest_common.sh@1531 -- # oacs_ns_manage=8 00:05:23.204 16:13:51 -- common/autotest_common.sh@1533 -- # [[ 8 -ne 0 ]] 00:05:23.204 16:13:51 -- common/autotest_common.sh@1539 -- # nvme id-ctrl /dev/nvme0 00:05:23.204 16:13:51 -- common/autotest_common.sh@1539 -- # cut -d: -f2 00:05:23.204 16:13:51 -- common/autotest_common.sh@1539 -- # grep unvmcap 00:05:23.204 16:13:51 -- common/autotest_common.sh@1539 -- # unvmcap=' 0' 00:05:23.204 16:13:51 -- common/autotest_common.sh@1540 -- # [[ 0 -eq 0 ]] 00:05:23.204 16:13:51 -- common/autotest_common.sh@1542 -- # continue 00:05:23.204 16:13:51 -- spdk/autotest.sh@146 -- # timing_exit pre_cleanup 00:05:23.204 16:13:51 -- common/autotest_common.sh@718 -- # xtrace_disable 00:05:23.204 16:13:51 -- common/autotest_common.sh@10 -- # set +x 00:05:23.204 16:13:51 -- spdk/autotest.sh@149 -- # timing_enter afterboot 00:05:23.204 16:13:51 -- common/autotest_common.sh@712 -- # xtrace_disable 00:05:23.204 16:13:51 -- common/autotest_common.sh@10 -- # set +x 00:05:23.204 16:13:51 -- spdk/autotest.sh@150 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:26.538 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:26.538 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:26.539 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:26.539 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:26.796 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:26.796 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:26.796 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:26.796 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:26.796 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:26.796 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:26.796 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:26.796 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:26.796 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:26.796 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:26.796 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:26.796 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:28.699 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:28.699 16:13:57 -- spdk/autotest.sh@151 -- # timing_exit afterboot 00:05:28.699 16:13:57 -- common/autotest_common.sh@718 -- # xtrace_disable 00:05:28.699 16:13:57 -- common/autotest_common.sh@10 -- # set +x 00:05:28.699 16:13:57 -- spdk/autotest.sh@155 -- # opal_revert_cleanup 00:05:28.699 16:13:57 -- common/autotest_common.sh@1576 -- # mapfile -t bdfs 00:05:28.699 16:13:57 -- common/autotest_common.sh@1576 -- # get_nvme_bdfs_by_id 0x0a54 00:05:28.699 16:13:57 -- common/autotest_common.sh@1562 -- # bdfs=() 00:05:28.699 16:13:57 -- common/autotest_common.sh@1562 -- # local bdfs 00:05:28.699 16:13:57 -- common/autotest_common.sh@1564 -- # 
get_nvme_bdfs 00:05:28.699 16:13:57 -- common/autotest_common.sh@1498 -- # bdfs=() 00:05:28.699 16:13:57 -- common/autotest_common.sh@1498 -- # local bdfs 00:05:28.699 16:13:57 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:28.699 16:13:57 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:28.699 16:13:57 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:05:28.699 16:13:57 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:05:28.699 16:13:57 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:d8:00.0 00:05:28.699 16:13:57 -- common/autotest_common.sh@1564 -- # for bdf in $(get_nvme_bdfs) 00:05:28.699 16:13:57 -- common/autotest_common.sh@1565 -- # cat /sys/bus/pci/devices/0000:d8:00.0/device 00:05:28.699 16:13:57 -- common/autotest_common.sh@1565 -- # device=0x0a54 00:05:28.699 16:13:57 -- common/autotest_common.sh@1566 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:05:28.699 16:13:57 -- common/autotest_common.sh@1567 -- # bdfs+=($bdf) 00:05:28.699 16:13:57 -- common/autotest_common.sh@1571 -- # printf '%s\n' 0000:d8:00.0 00:05:28.699 16:13:57 -- common/autotest_common.sh@1577 -- # [[ -z 0000:d8:00.0 ]] 00:05:28.699 16:13:57 -- common/autotest_common.sh@1582 -- # spdk_tgt_pid=2244610 00:05:28.699 16:13:57 -- common/autotest_common.sh@1583 -- # waitforlisten 2244610 00:05:28.699 16:13:57 -- common/autotest_common.sh@1581 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:28.699 16:13:57 -- common/autotest_common.sh@819 -- # '[' -z 2244610 ']' 00:05:28.699 16:13:57 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:28.699 16:13:57 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:28.699 16:13:57 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:28.699 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:28.699 16:13:57 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:28.699 16:13:57 -- common/autotest_common.sh@10 -- # set +x 00:05:28.699 [2024-07-20 16:13:57.370713] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
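The spdk_tgt launch above is gated by waitforlisten, which blocks until the target's JSON-RPC socket answers. A rough stand-alone equivalent, assuming this job's checkout path, the default /var/tmp/spdk.sock address, and rpc_get_methods used purely as a readiness probe:

    #!/usr/bin/env bash
    # Start spdk_tgt and poll its RPC socket until it responds
    # (assumptions: default socket path; rpc.py from the same checkout).
    rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    "$rootdir/build/bin/spdk_tgt" &
    pid=$!
    until "$rootdir/scripts/rpc.py" -s /var/tmp/spdk.sock rpc_get_methods >/dev/null 2>&1; do
      kill -0 "$pid" 2>/dev/null || { echo "spdk_tgt exited early" >&2; exit 1; }
      sleep 0.5
    done
    echo "spdk_tgt (pid $pid) is listening on /var/tmp/spdk.sock"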
00:05:28.699 [2024-07-20 16:13:57.370781] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2244610 ] 00:05:28.699 EAL: No free 2048 kB hugepages reported on node 1 00:05:28.699 [2024-07-20 16:13:57.440496] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:28.699 [2024-07-20 16:13:57.480571] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:28.699 [2024-07-20 16:13:57.480703] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:29.634 16:13:58 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:29.634 16:13:58 -- common/autotest_common.sh@852 -- # return 0 00:05:29.634 16:13:58 -- common/autotest_common.sh@1585 -- # bdf_id=0 00:05:29.634 16:13:58 -- common/autotest_common.sh@1586 -- # for bdf in "${bdfs[@]}" 00:05:29.635 16:13:58 -- common/autotest_common.sh@1587 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0 00:05:32.919 nvme0n1 00:05:32.919 16:14:01 -- common/autotest_common.sh@1589 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:05:32.919 [2024-07-20 16:14:01.324456] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:05:32.919 request: 00:05:32.919 { 00:05:32.919 "nvme_ctrlr_name": "nvme0", 00:05:32.919 "password": "test", 00:05:32.919 "method": "bdev_nvme_opal_revert", 00:05:32.919 "req_id": 1 00:05:32.919 } 00:05:32.919 Got JSON-RPC error response 00:05:32.919 response: 00:05:32.919 { 00:05:32.919 "code": -32602, 00:05:32.919 "message": "Invalid parameters" 00:05:32.919 } 00:05:32.919 16:14:01 -- common/autotest_common.sh@1589 -- # true 00:05:32.919 16:14:01 -- common/autotest_common.sh@1590 -- # (( ++bdf_id )) 00:05:32.919 16:14:01 -- common/autotest_common.sh@1593 -- # killprocess 2244610 00:05:32.919 16:14:01 -- common/autotest_common.sh@926 -- # '[' -z 2244610 ']' 00:05:32.919 16:14:01 -- common/autotest_common.sh@930 -- # kill -0 2244610 00:05:32.919 16:14:01 -- common/autotest_common.sh@931 -- # uname 00:05:32.919 16:14:01 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:32.919 16:14:01 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2244610 00:05:32.919 16:14:01 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:32.919 16:14:01 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:32.919 16:14:01 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2244610' 00:05:32.919 killing process with pid 2244610 00:05:32.919 16:14:01 -- common/autotest_common.sh@945 -- # kill 2244610 00:05:32.919 16:14:01 -- common/autotest_common.sh@950 -- # wait 2244610 00:05:32.919 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:32.919 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:32.919 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:32.919 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:32.919 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:32.919 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:32.919 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:32.919 EAL: Unexpected size 0 of DMA remapping cleared 
instead of 2097152 00:05:34.819 16:14:03 -- spdk/autotest.sh@161 -- # '[' 0 -eq 1 ']' 00:05:34.819 16:14:03 -- spdk/autotest.sh@165 -- # '[' 1 -eq 1 ']' 00:05:34.819 16:14:03 -- spdk/autotest.sh@166 -- # [[ 0 -eq 1 ]] 00:05:34.819 16:14:03 -- spdk/autotest.sh@166 -- # [[ 0 -eq 1 ]] 00:05:34.819 16:14:03 -- spdk/autotest.sh@173 -- # timing_enter lib 00:05:34.819 16:14:03 -- common/autotest_common.sh@712 -- # xtrace_disable 00:05:34.819 16:14:03 -- common/autotest_common.sh@10 -- # set +x 00:05:34.819 16:14:03 -- spdk/autotest.sh@175 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:05:34.819 16:14:03 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:34.819 16:14:03 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:34.819 16:14:03 -- common/autotest_common.sh@10 -- # set +x 00:05:34.819 ************************************ 00:05:34.819 START TEST env 00:05:34.819 ************************************ 00:05:34.819 16:14:03 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:05:34.819 * Looking for test storage...
00:05:34.819 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:05:34.819 16:14:03 -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:05:34.819 16:14:03 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:34.819 16:14:03 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:34.819 16:14:03 -- common/autotest_common.sh@10 -- # set +x 00:05:35.078 ************************************ 00:05:35.078 START TEST env_memory 00:05:35.078 ************************************ 00:05:35.078 16:14:03 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:05:35.078 00:05:35.078 00:05:35.078 CUnit - A unit testing framework for C - Version 2.1-3 00:05:35.078 http://cunit.sourceforge.net/ 00:05:35.078 00:05:35.078 00:05:35.078 Suite: memory 00:05:35.078 Test: alloc and free memory map ...[2024-07-20 16:14:03.659777] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:05:35.078 passed 00:05:35.078 Test: mem map translation ...[2024-07-20 16:14:03.672826] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:05:35.078 [2024-07-20 16:14:03.672845] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:05:35.078 [2024-07-20 16:14:03.672876] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:05:35.078 [2024-07-20 16:14:03.672885] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:05:35.078 passed 00:05:35.078 Test: mem map registration ...[2024-07-20 16:14:03.694774] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:05:35.078 [2024-07-20 16:14:03.694791] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:05:35.078 passed 00:05:35.078 Test: mem map adjacent registrations ...passed 00:05:35.078 00:05:35.078 Run Summary: Type Total Ran Passed Failed Inactive 00:05:35.078 suites 1 1 n/a 0 0 00:05:35.078 tests 4 4 4 0 0 00:05:35.078 asserts 152 152 152 0 n/a 00:05:35.078 00:05:35.078 Elapsed time = 0.088 seconds 00:05:35.078 00:05:35.078 real 0m0.101s 00:05:35.078 user 0m0.093s 00:05:35.078 sys 0m0.008s 00:05:35.078 16:14:03 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:35.078 16:14:03 -- common/autotest_common.sh@10 -- # set +x 00:05:35.078 ************************************ 00:05:35.078 END TEST env_memory 00:05:35.078 ************************************ 00:05:35.078 16:14:03 -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:35.078 16:14:03 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:35.078 16:14:03 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:35.078 16:14:03 -- common/autotest_common.sh@10 
-- # set +x 00:05:35.078 ************************************ 00:05:35.078 START TEST env_vtophys 00:05:35.078 ************************************ 00:05:35.079 16:14:03 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:35.079 EAL: lib.eal log level changed from notice to debug 00:05:35.079 EAL: Detected lcore 0 as core 0 on socket 0 00:05:35.079 EAL: Detected lcore 1 as core 1 on socket 0 00:05:35.079 EAL: Detected lcore 2 as core 2 on socket 0 00:05:35.079 EAL: Detected lcore 3 as core 3 on socket 0 00:05:35.079 EAL: Detected lcore 4 as core 4 on socket 0 00:05:35.079 EAL: Detected lcore 5 as core 5 on socket 0 00:05:35.079 EAL: Detected lcore 6 as core 6 on socket 0 00:05:35.079 EAL: Detected lcore 7 as core 8 on socket 0 00:05:35.079 EAL: Detected lcore 8 as core 9 on socket 0 00:05:35.079 EAL: Detected lcore 9 as core 10 on socket 0 00:05:35.079 EAL: Detected lcore 10 as core 11 on socket 0 00:05:35.079 EAL: Detected lcore 11 as core 12 on socket 0 00:05:35.079 EAL: Detected lcore 12 as core 13 on socket 0 00:05:35.079 EAL: Detected lcore 13 as core 14 on socket 0 00:05:35.079 EAL: Detected lcore 14 as core 16 on socket 0 00:05:35.079 EAL: Detected lcore 15 as core 17 on socket 0 00:05:35.079 EAL: Detected lcore 16 as core 18 on socket 0 00:05:35.079 EAL: Detected lcore 17 as core 19 on socket 0 00:05:35.079 EAL: Detected lcore 18 as core 20 on socket 0 00:05:35.079 EAL: Detected lcore 19 as core 21 on socket 0 00:05:35.079 EAL: Detected lcore 20 as core 22 on socket 0 00:05:35.079 EAL: Detected lcore 21 as core 24 on socket 0 00:05:35.079 EAL: Detected lcore 22 as core 25 on socket 0 00:05:35.079 EAL: Detected lcore 23 as core 26 on socket 0 00:05:35.079 EAL: Detected lcore 24 as core 27 on socket 0 00:05:35.079 EAL: Detected lcore 25 as core 28 on socket 0 00:05:35.079 EAL: Detected lcore 26 as core 29 on socket 0 00:05:35.079 EAL: Detected lcore 27 as core 30 on socket 0 00:05:35.079 EAL: Detected lcore 28 as core 0 on socket 1 00:05:35.079 EAL: Detected lcore 29 as core 1 on socket 1 00:05:35.079 EAL: Detected lcore 30 as core 2 on socket 1 00:05:35.079 EAL: Detected lcore 31 as core 3 on socket 1 00:05:35.079 EAL: Detected lcore 32 as core 4 on socket 1 00:05:35.079 EAL: Detected lcore 33 as core 5 on socket 1 00:05:35.079 EAL: Detected lcore 34 as core 6 on socket 1 00:05:35.079 EAL: Detected lcore 35 as core 8 on socket 1 00:05:35.079 EAL: Detected lcore 36 as core 9 on socket 1 00:05:35.079 EAL: Detected lcore 37 as core 10 on socket 1 00:05:35.079 EAL: Detected lcore 38 as core 11 on socket 1 00:05:35.079 EAL: Detected lcore 39 as core 12 on socket 1 00:05:35.079 EAL: Detected lcore 40 as core 13 on socket 1 00:05:35.079 EAL: Detected lcore 41 as core 14 on socket 1 00:05:35.079 EAL: Detected lcore 42 as core 16 on socket 1 00:05:35.079 EAL: Detected lcore 43 as core 17 on socket 1 00:05:35.079 EAL: Detected lcore 44 as core 18 on socket 1 00:05:35.079 EAL: Detected lcore 45 as core 19 on socket 1 00:05:35.079 EAL: Detected lcore 46 as core 20 on socket 1 00:05:35.079 EAL: Detected lcore 47 as core 21 on socket 1 00:05:35.079 EAL: Detected lcore 48 as core 22 on socket 1 00:05:35.079 EAL: Detected lcore 49 as core 24 on socket 1 00:05:35.079 EAL: Detected lcore 50 as core 25 on socket 1 00:05:35.079 EAL: Detected lcore 51 as core 26 on socket 1 00:05:35.079 EAL: Detected lcore 52 as core 27 on socket 1 00:05:35.079 EAL: Detected lcore 53 as core 28 on socket 1 00:05:35.079 EAL: Detected lcore 54 as core 
29 on socket 1 00:05:35.079 EAL: Detected lcore 55 as core 30 on socket 1 00:05:35.079 EAL: Detected lcore 56 as core 0 on socket 0 00:05:35.079 EAL: Detected lcore 57 as core 1 on socket 0 00:05:35.079 EAL: Detected lcore 58 as core 2 on socket 0 00:05:35.079 EAL: Detected lcore 59 as core 3 on socket 0 00:05:35.079 EAL: Detected lcore 60 as core 4 on socket 0 00:05:35.079 EAL: Detected lcore 61 as core 5 on socket 0 00:05:35.079 EAL: Detected lcore 62 as core 6 on socket 0 00:05:35.079 EAL: Detected lcore 63 as core 8 on socket 0 00:05:35.079 EAL: Detected lcore 64 as core 9 on socket 0 00:05:35.079 EAL: Detected lcore 65 as core 10 on socket 0 00:05:35.079 EAL: Detected lcore 66 as core 11 on socket 0 00:05:35.079 EAL: Detected lcore 67 as core 12 on socket 0 00:05:35.079 EAL: Detected lcore 68 as core 13 on socket 0 00:05:35.079 EAL: Detected lcore 69 as core 14 on socket 0 00:05:35.079 EAL: Detected lcore 70 as core 16 on socket 0 00:05:35.079 EAL: Detected lcore 71 as core 17 on socket 0 00:05:35.079 EAL: Detected lcore 72 as core 18 on socket 0 00:05:35.079 EAL: Detected lcore 73 as core 19 on socket 0 00:05:35.079 EAL: Detected lcore 74 as core 20 on socket 0 00:05:35.079 EAL: Detected lcore 75 as core 21 on socket 0 00:05:35.079 EAL: Detected lcore 76 as core 22 on socket 0 00:05:35.079 EAL: Detected lcore 77 as core 24 on socket 0 00:05:35.079 EAL: Detected lcore 78 as core 25 on socket 0 00:05:35.079 EAL: Detected lcore 79 as core 26 on socket 0 00:05:35.079 EAL: Detected lcore 80 as core 27 on socket 0 00:05:35.079 EAL: Detected lcore 81 as core 28 on socket 0 00:05:35.079 EAL: Detected lcore 82 as core 29 on socket 0 00:05:35.079 EAL: Detected lcore 83 as core 30 on socket 0 00:05:35.079 EAL: Detected lcore 84 as core 0 on socket 1 00:05:35.079 EAL: Detected lcore 85 as core 1 on socket 1 00:05:35.079 EAL: Detected lcore 86 as core 2 on socket 1 00:05:35.079 EAL: Detected lcore 87 as core 3 on socket 1 00:05:35.079 EAL: Detected lcore 88 as core 4 on socket 1 00:05:35.079 EAL: Detected lcore 89 as core 5 on socket 1 00:05:35.079 EAL: Detected lcore 90 as core 6 on socket 1 00:05:35.079 EAL: Detected lcore 91 as core 8 on socket 1 00:05:35.079 EAL: Detected lcore 92 as core 9 on socket 1 00:05:35.079 EAL: Detected lcore 93 as core 10 on socket 1 00:05:35.079 EAL: Detected lcore 94 as core 11 on socket 1 00:05:35.079 EAL: Detected lcore 95 as core 12 on socket 1 00:05:35.079 EAL: Detected lcore 96 as core 13 on socket 1 00:05:35.079 EAL: Detected lcore 97 as core 14 on socket 1 00:05:35.079 EAL: Detected lcore 98 as core 16 on socket 1 00:05:35.079 EAL: Detected lcore 99 as core 17 on socket 1 00:05:35.079 EAL: Detected lcore 100 as core 18 on socket 1 00:05:35.079 EAL: Detected lcore 101 as core 19 on socket 1 00:05:35.079 EAL: Detected lcore 102 as core 20 on socket 1 00:05:35.079 EAL: Detected lcore 103 as core 21 on socket 1 00:05:35.079 EAL: Detected lcore 104 as core 22 on socket 1 00:05:35.079 EAL: Detected lcore 105 as core 24 on socket 1 00:05:35.079 EAL: Detected lcore 106 as core 25 on socket 1 00:05:35.079 EAL: Detected lcore 107 as core 26 on socket 1 00:05:35.079 EAL: Detected lcore 108 as core 27 on socket 1 00:05:35.079 EAL: Detected lcore 109 as core 28 on socket 1 00:05:35.079 EAL: Detected lcore 110 as core 29 on socket 1 00:05:35.079 EAL: Detected lcore 111 as core 30 on socket 1 00:05:35.079 EAL: Maximum logical cores by configuration: 128 00:05:35.079 EAL: Detected CPU lcores: 112 00:05:35.079 EAL: Detected NUMA nodes: 2 00:05:35.079 EAL: Checking presence 
of .so 'librte_eal.so.23.0' 00:05:35.079 EAL: Checking presence of .so 'librte_eal.so.23' 00:05:35.079 EAL: Checking presence of .so 'librte_eal.so' 00:05:35.079 EAL: Detected static linkage of DPDK 00:05:35.079 EAL: No shared files mode enabled, IPC will be disabled 00:05:35.079 EAL: Bus pci wants IOVA as 'DC' 00:05:35.079 EAL: Buses did not request a specific IOVA mode. 00:05:35.079 EAL: IOMMU is available, selecting IOVA as VA mode. 00:05:35.079 EAL: Selected IOVA mode 'VA' 00:05:35.079 EAL: No free 2048 kB hugepages reported on node 1 00:05:35.079 EAL: Probing VFIO support... 00:05:35.079 EAL: IOMMU type 1 (Type 1) is supported 00:05:35.079 EAL: IOMMU type 7 (sPAPR) is not supported 00:05:35.079 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:05:35.079 EAL: VFIO support initialized 00:05:35.079 EAL: Ask a virtual area of 0x2e000 bytes 00:05:35.079 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:05:35.079 EAL: Setting up physically contiguous memory... 00:05:35.079 EAL: Setting maximum number of open files to 524288 00:05:35.079 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:05:35.079 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:05:35.079 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:05:35.079 EAL: Ask a virtual area of 0x61000 bytes 00:05:35.079 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:05:35.079 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:35.079 EAL: Ask a virtual area of 0x400000000 bytes 00:05:35.079 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:05:35.079 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:05:35.079 EAL: Ask a virtual area of 0x61000 bytes 00:05:35.079 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:05:35.079 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:35.079 EAL: Ask a virtual area of 0x400000000 bytes 00:05:35.079 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:05:35.079 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:05:35.079 EAL: Ask a virtual area of 0x61000 bytes 00:05:35.079 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:05:35.079 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:35.079 EAL: Ask a virtual area of 0x400000000 bytes 00:05:35.079 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:05:35.079 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:05:35.079 EAL: Ask a virtual area of 0x61000 bytes 00:05:35.079 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:05:35.079 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:35.079 EAL: Ask a virtual area of 0x400000000 bytes 00:05:35.079 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:05:35.079 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:05:35.079 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:05:35.079 EAL: Ask a virtual area of 0x61000 bytes 00:05:35.079 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:05:35.079 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:35.079 EAL: Ask a virtual area of 0x400000000 bytes 00:05:35.079 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:05:35.079 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:05:35.079 EAL: Ask a virtual area of 0x61000 bytes 00:05:35.079 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 
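Note: the *ERROR* lines that memory_ut printed earlier (vaddr=2097152 len=1234 and vaddr=1234 len=2097152) are expected negative tests — spdk_mem_map_set_translation() only accepts virtual addresses and lengths in 2 MB units. A minimal sketch of that constraint, assuming the public spdk/env.h mem map API (this is not the memory_ut source):

#include "spdk/env.h"

static int
noop_notify(void *cb_ctx, struct spdk_mem_map *map,
            enum spdk_mem_map_notify_action action, void *vaddr, size_t size)
{
        return 0; /* accept every region */
}

static const struct spdk_mem_map_ops noop_ops = { .notify_cb = noop_notify };

void
translation_sketch(void)
{
        struct spdk_mem_map *map = spdk_mem_map_alloc(0, &noop_ops, NULL);

        /* Accepted: vaddr and size are both 2 MB aligned. */
        spdk_mem_map_set_translation(map, 0x200000, 0x200000, 0xdead0000);
        /* Rejected with the 'invalid ... parameters' error seen earlier:
         * len=1234 is not a multiple of 2 MB. */
        spdk_mem_map_set_translation(map, 0x200000, 1234, 0xdead0000);

        spdk_mem_map_free(&map);
}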
00:05:35.079 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:35.079 EAL: Ask a virtual area of 0x400000000 bytes 00:05:35.079 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:05:35.079 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:05:35.079 EAL: Ask a virtual area of 0x61000 bytes 00:05:35.079 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:05:35.079 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:35.079 EAL: Ask a virtual area of 0x400000000 bytes 00:05:35.079 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:05:35.079 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:05:35.079 EAL: Ask a virtual area of 0x61000 bytes 00:05:35.079 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:05:35.079 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:35.079 EAL: Ask a virtual area of 0x400000000 bytes 00:05:35.079 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:05:35.079 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:05:35.079 EAL: Hugepages will be freed exactly as allocated. 00:05:35.079 EAL: No shared files mode enabled, IPC is disabled 00:05:35.079 EAL: No shared files mode enabled, IPC is disabled 00:05:35.079 EAL: TSC frequency is ~2500000 KHz 00:05:35.079 EAL: Main lcore 0 is ready (tid=7f8860143a00;cpuset=[0]) 00:05:35.079 EAL: Trying to obtain current memory policy. 00:05:35.080 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:35.080 EAL: Restoring previous memory policy: 0 00:05:35.080 EAL: request: mp_malloc_sync 00:05:35.080 EAL: No shared files mode enabled, IPC is disabled 00:05:35.080 EAL: Heap on socket 0 was expanded by 2MB 00:05:35.080 EAL: No shared files mode enabled, IPC is disabled 00:05:35.080 EAL: Mem event callback 'spdk:(nil)' registered 00:05:35.080 00:05:35.080 00:05:35.080 CUnit - A unit testing framework for C - Version 2.1-3 00:05:35.080 http://cunit.sourceforge.net/ 00:05:35.080 00:05:35.080 00:05:35.080 Suite: components_suite 00:05:35.080 Test: vtophys_malloc_test ...passed 00:05:35.080 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:05:35.080 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:35.080 EAL: Restoring previous memory policy: 4 00:05:35.080 EAL: Calling mem event callback 'spdk:(nil)' 00:05:35.080 EAL: request: mp_malloc_sync 00:05:35.080 EAL: No shared files mode enabled, IPC is disabled 00:05:35.080 EAL: Heap on socket 0 was expanded by 4MB 00:05:35.080 EAL: Calling mem event callback 'spdk:(nil)' 00:05:35.080 EAL: request: mp_malloc_sync 00:05:35.080 EAL: No shared files mode enabled, IPC is disabled 00:05:35.080 EAL: Heap on socket 0 was shrunk by 4MB 00:05:35.080 EAL: Trying to obtain current memory policy. 00:05:35.080 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:35.080 EAL: Restoring previous memory policy: 4 00:05:35.080 EAL: Calling mem event callback 'spdk:(nil)' 00:05:35.080 EAL: request: mp_malloc_sync 00:05:35.080 EAL: No shared files mode enabled, IPC is disabled 00:05:35.080 EAL: Heap on socket 0 was expanded by 6MB 00:05:35.080 EAL: Calling mem event callback 'spdk:(nil)' 00:05:35.080 EAL: request: mp_malloc_sync 00:05:35.080 EAL: No shared files mode enabled, IPC is disabled 00:05:35.080 EAL: Heap on socket 0 was shrunk by 6MB 00:05:35.080 EAL: Trying to obtain current memory policy. 
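The vtophys_malloc_test that just passed checks translation both ways: buffers from plain malloc() are not registered with the env layer and must translate to SPDK_VTOPHYS_ERROR, while DMA-safe allocations resolve to a real physical address. A hedged sketch of that check (assumed behavior, not the test source):

#include <assert.h>
#include <stdlib.h>
#include "spdk/env.h"

void
vtophys_sketch(void)
{
        void *heap = malloc(4096);                        /* unregistered memory */
        void *dma  = spdk_dma_malloc(4096, 0x1000, NULL); /* hugepage-backed */

        assert(spdk_vtophys(heap, NULL) == SPDK_VTOPHYS_ERROR);
        assert(spdk_vtophys(dma, NULL)  != SPDK_VTOPHYS_ERROR);

        spdk_dma_free(dma);
        free(heap);
}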
00:05:35.080 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:35.080 EAL: Restoring previous memory policy: 4 00:05:35.080 EAL: Calling mem event callback 'spdk:(nil)' 00:05:35.080 EAL: request: mp_malloc_sync 00:05:35.080 EAL: No shared files mode enabled, IPC is disabled 00:05:35.080 EAL: Heap on socket 0 was expanded by 10MB 00:05:35.080 EAL: Calling mem event callback 'spdk:(nil)' 00:05:35.080 EAL: request: mp_malloc_sync 00:05:35.080 EAL: No shared files mode enabled, IPC is disabled 00:05:35.080 EAL: Heap on socket 0 was shrunk by 10MB 00:05:35.080 EAL: Trying to obtain current memory policy. 00:05:35.080 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:35.080 EAL: Restoring previous memory policy: 4 00:05:35.080 EAL: Calling mem event callback 'spdk:(nil)' 00:05:35.080 EAL: request: mp_malloc_sync 00:05:35.080 EAL: No shared files mode enabled, IPC is disabled 00:05:35.080 EAL: Heap on socket 0 was expanded by 18MB 00:05:35.080 EAL: Calling mem event callback 'spdk:(nil)' 00:05:35.080 EAL: request: mp_malloc_sync 00:05:35.080 EAL: No shared files mode enabled, IPC is disabled 00:05:35.080 EAL: Heap on socket 0 was shrunk by 18MB 00:05:35.080 EAL: Trying to obtain current memory policy. 00:05:35.080 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:35.080 EAL: Restoring previous memory policy: 4 00:05:35.080 EAL: Calling mem event callback 'spdk:(nil)' 00:05:35.080 EAL: request: mp_malloc_sync 00:05:35.080 EAL: No shared files mode enabled, IPC is disabled 00:05:35.080 EAL: Heap on socket 0 was expanded by 34MB 00:05:35.338 EAL: Calling mem event callback 'spdk:(nil)' 00:05:35.339 EAL: request: mp_malloc_sync 00:05:35.339 EAL: No shared files mode enabled, IPC is disabled 00:05:35.339 EAL: Heap on socket 0 was shrunk by 34MB 00:05:35.339 EAL: Trying to obtain current memory policy. 00:05:35.339 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:35.339 EAL: Restoring previous memory policy: 4 00:05:35.339 EAL: Calling mem event callback 'spdk:(nil)' 00:05:35.339 EAL: request: mp_malloc_sync 00:05:35.339 EAL: No shared files mode enabled, IPC is disabled 00:05:35.339 EAL: Heap on socket 0 was expanded by 66MB 00:05:35.339 EAL: Calling mem event callback 'spdk:(nil)' 00:05:35.339 EAL: request: mp_malloc_sync 00:05:35.339 EAL: No shared files mode enabled, IPC is disabled 00:05:35.339 EAL: Heap on socket 0 was shrunk by 66MB 00:05:35.339 EAL: Trying to obtain current memory policy. 00:05:35.339 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:35.339 EAL: Restoring previous memory policy: 4 00:05:35.339 EAL: Calling mem event callback 'spdk:(nil)' 00:05:35.339 EAL: request: mp_malloc_sync 00:05:35.339 EAL: No shared files mode enabled, IPC is disabled 00:05:35.339 EAL: Heap on socket 0 was expanded by 130MB 00:05:35.339 EAL: Calling mem event callback 'spdk:(nil)' 00:05:35.339 EAL: request: mp_malloc_sync 00:05:35.339 EAL: No shared files mode enabled, IPC is disabled 00:05:35.339 EAL: Heap on socket 0 was shrunk by 130MB 00:05:35.339 EAL: Trying to obtain current memory policy. 
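Each expanded/shrunk pair here is one rung of vtophys_spdk_malloc_test's allocation ladder: the sizes grow as a power of two plus one 2 MB page (4, 6, 10, 18, 34, 66, 130 MB so far, continuing to 1026 MB below), and every step fires the 'spdk:(nil)' mem event callback that SPDK registers with DPDK to (un)register hugepage memory. A rough sketch of the loop shape, assuming spdk_dma_malloc() drives the heap growth (not the test source):

#include "spdk/env.h"

void
malloc_ladder_sketch(void)
{
        /* 4 MB, 8 MB, ... up to 1 GB; the real test adds a 2 MB page per step. */
        for (uint64_t sz = 4ULL << 20; sz <= 1ULL << 30; sz *= 2) {
                void *buf = spdk_dma_malloc(sz, 0x200000, NULL);

                if (buf == NULL) {
                        break; /* out of hugepages; this log got as far as 1026 MB */
                }
                spdk_dma_free(buf); /* triggers the matching 'shrunk by' event */
        }
}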
00:05:35.339 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:35.339 EAL: Restoring previous memory policy: 4 00:05:35.339 EAL: Calling mem event callback 'spdk:(nil)' 00:05:35.339 EAL: request: mp_malloc_sync 00:05:35.339 EAL: No shared files mode enabled, IPC is disabled 00:05:35.339 EAL: Heap on socket 0 was expanded by 258MB 00:05:35.339 EAL: Calling mem event callback 'spdk:(nil)' 00:05:35.339 EAL: request: mp_malloc_sync 00:05:35.339 EAL: No shared files mode enabled, IPC is disabled 00:05:35.339 EAL: Heap on socket 0 was shrunk by 258MB 00:05:35.339 EAL: Trying to obtain current memory policy. 00:05:35.339 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:35.597 EAL: Restoring previous memory policy: 4 00:05:35.597 EAL: Calling mem event callback 'spdk:(nil)' 00:05:35.597 EAL: request: mp_malloc_sync 00:05:35.597 EAL: No shared files mode enabled, IPC is disabled 00:05:35.597 EAL: Heap on socket 0 was expanded by 514MB 00:05:35.597 EAL: Calling mem event callback 'spdk:(nil)' 00:05:35.597 EAL: request: mp_malloc_sync 00:05:35.597 EAL: No shared files mode enabled, IPC is disabled 00:05:35.597 EAL: Heap on socket 0 was shrunk by 514MB 00:05:35.597 EAL: Trying to obtain current memory policy. 00:05:35.597 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:35.855 EAL: Restoring previous memory policy: 4 00:05:35.855 EAL: Calling mem event callback 'spdk:(nil)' 00:05:35.855 EAL: request: mp_malloc_sync 00:05:35.855 EAL: No shared files mode enabled, IPC is disabled 00:05:35.855 EAL: Heap on socket 0 was expanded by 1026MB 00:05:36.114 EAL: Calling mem event callback 'spdk:(nil)' 00:05:36.114 EAL: request: mp_malloc_sync 00:05:36.114 EAL: No shared files mode enabled, IPC is disabled 00:05:36.114 EAL: Heap on socket 0 was shrunk by 1026MB 00:05:36.114 passed 00:05:36.114 00:05:36.114 Run Summary: Type Total Ran Passed Failed Inactive 00:05:36.114 suites 1 1 n/a 0 0 00:05:36.114 tests 2 2 2 0 0 00:05:36.114 asserts 497 497 497 0 n/a 00:05:36.114 00:05:36.114 Elapsed time = 0.963 seconds 00:05:36.114 EAL: Calling mem event callback 'spdk:(nil)' 00:05:36.114 EAL: request: mp_malloc_sync 00:05:36.114 EAL: No shared files mode enabled, IPC is disabled 00:05:36.114 EAL: Heap on socket 0 was shrunk by 2MB 00:05:36.114 EAL: No shared files mode enabled, IPC is disabled 00:05:36.114 EAL: No shared files mode enabled, IPC is disabled 00:05:36.114 EAL: No shared files mode enabled, IPC is disabled 00:05:36.114 00:05:36.114 real 0m1.090s 00:05:36.114 user 0m0.619s 00:05:36.114 sys 0m0.438s 00:05:36.114 16:14:04 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:36.114 16:14:04 -- common/autotest_common.sh@10 -- # set +x 00:05:36.114 ************************************ 00:05:36.114 END TEST env_vtophys 00:05:36.114 ************************************ 00:05:36.114 16:14:04 -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:36.114 16:14:04 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:36.114 16:14:04 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:36.114 16:14:04 -- common/autotest_common.sh@10 -- # set +x 00:05:36.114 ************************************ 00:05:36.114 START TEST env_pci 00:05:36.114 ************************************ 00:05:36.114 16:14:04 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:36.373 00:05:36.373 00:05:36.373 CUnit - A unit testing framework for C - Version 2.1-3 00:05:36.373 
http://cunit.sourceforge.net/ 00:05:36.373 00:05:36.373 00:05:36.373 Suite: pci 00:05:36.373 Test: pci_hook ...[2024-07-20 16:14:04.923355] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1041:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 2245994 has claimed it 00:05:36.373 EAL: Cannot find device (10000:00:01.0) 00:05:36.373 EAL: Failed to attach device on primary process 00:05:36.373 passed 00:05:36.373 00:05:36.373 Run Summary: Type Total Ran Passed Failed Inactive 00:05:36.373 suites 1 1 n/a 0 0 00:05:36.373 tests 1 1 1 0 0 00:05:36.373 asserts 25 25 25 0 n/a 00:05:36.373 00:05:36.373 Elapsed time = 0.034 seconds 00:05:36.373 00:05:36.373 real 0m0.052s 00:05:36.373 user 0m0.011s 00:05:36.373 sys 0m0.041s 00:05:36.373 16:14:04 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:36.373 16:14:04 -- common/autotest_common.sh@10 -- # set +x 00:05:36.373 ************************************ 00:05:36.373 END TEST env_pci 00:05:36.373 ************************************ 00:05:36.373 16:14:04 -- env/env.sh@14 -- # argv='-c 0x1 ' 00:05:36.373 16:14:04 -- env/env.sh@15 -- # uname 00:05:36.373 16:14:05 -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:05:36.373 16:14:05 -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:05:36.373 16:14:05 -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:36.373 16:14:05 -- common/autotest_common.sh@1077 -- # '[' 5 -le 1 ']' 00:05:36.373 16:14:05 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:36.373 16:14:05 -- common/autotest_common.sh@10 -- # set +x 00:05:36.373 ************************************ 00:05:36.373 START TEST env_dpdk_post_init 00:05:36.373 ************************************ 00:05:36.373 16:14:05 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:36.373 EAL: Detected CPU lcores: 112 00:05:36.373 EAL: Detected NUMA nodes: 2 00:05:36.373 EAL: Detected static linkage of DPDK 00:05:36.373 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:36.373 EAL: Selected IOVA mode 'VA' 00:05:36.373 EAL: No free 2048 kB hugepages reported on node 1 00:05:36.373 EAL: VFIO support initialized 00:05:36.373 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:36.373 EAL: Using IOMMU type 1 (Type 1) 00:05:37.308 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:d8:00.0 (socket 1) 00:05:41.489 EAL: Releasing PCI mapped resource for 0000:d8:00.0 00:05:41.489 EAL: Calling pci_unmap_resource for 0000:d8:00.0 at 0x202001000000 00:05:41.489 Starting DPDK initialization... 00:05:41.489 Starting SPDK post initialization... 00:05:41.489 SPDK NVMe probe 00:05:41.489 Attaching to 0000:d8:00.0 00:05:41.489 Attached to 0000:d8:00.0 00:05:41.489 Cleaning up... 
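env_dpdk_post_init condenses the whole bring-up above: initialize the env layer with the same -c 0x1 and --base-virtaddr=0x200000000000 options, then probe and attach the NVMe controller at 0000:d8:00.0. A minimal sketch of that flow using the public env/nvme APIs (opts field names assumed from spdk/env.h; this is not the test source):

#include <stdio.h>
#include "spdk/env.h"
#include "spdk/nvme.h"

static bool
probe_cb(void *ctx, const struct spdk_nvme_transport_id *trid,
         struct spdk_nvme_ctrlr_opts *opts)
{
        return true; /* attach to every controller the probe finds */
}

static void
attach_cb(void *ctx, const struct spdk_nvme_transport_id *trid,
          struct spdk_nvme_ctrlr *ctrlr, const struct spdk_nvme_ctrlr_opts *opts)
{
        printf("Attached to %s\n", trid->traddr); /* e.g. 0000:d8:00.0 */
}

int
main(void)
{
        struct spdk_env_opts opts;

        spdk_env_opts_init(&opts);
        opts.name = "env_dpdk_post_init";
        opts.core_mask = "0x1";              /* -c 0x1 */
        opts.base_virtaddr = 0x200000000000; /* --base-virtaddr */
        if (spdk_env_init(&opts) < 0) {
                return 1;
        }
        /* NULL trid probes the local PCIe bus, as in the log. */
        return spdk_nvme_probe(NULL, NULL, probe_cb, attach_cb, NULL) != 0;
}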
00:05:41.489 00:05:41.489 real 0m4.750s 00:05:41.489 user 0m3.579s 00:05:41.489 sys 0m0.415s 00:05:41.489 16:14:09 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:41.489 16:14:09 -- common/autotest_common.sh@10 -- # set +x 00:05:41.489 ************************************ 00:05:41.489 END TEST env_dpdk_post_init 00:05:41.489 ************************************ 00:05:41.489 16:14:09 -- env/env.sh@26 -- # uname 00:05:41.489 16:14:09 -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:05:41.489 16:14:09 -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:41.489 16:14:09 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:41.489 16:14:09 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:41.489 16:14:09 -- common/autotest_common.sh@10 -- # set +x 00:05:41.489 ************************************ 00:05:41.489 START TEST env_mem_callbacks 00:05:41.489 ************************************ 00:05:41.489 16:14:09 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:41.489 EAL: Detected CPU lcores: 112 00:05:41.489 EAL: Detected NUMA nodes: 2 00:05:41.489 EAL: Detected static linkage of DPDK 00:05:41.489 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:41.489 EAL: Selected IOVA mode 'VA' 00:05:41.489 EAL: No free 2048 kB hugepages reported on node 1 00:05:41.489 EAL: VFIO support initialized 00:05:41.489 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:41.489 00:05:41.489 00:05:41.489 CUnit - A unit testing framework for C - Version 2.1-3 00:05:41.489 http://cunit.sourceforge.net/ 00:05:41.489 00:05:41.489 00:05:41.489 Suite: memory 00:05:41.489 Test: test ... 
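The register/unregister lines that follow are printed from a mem map notify callback: mem_callbacks allocates a map whose notify hook fires as spdk_dma_malloc()/spdk_dma_free() grow and shrink the DPDK heap. A hedged sketch of that hook (assumed shape, not the mem_callbacks source):

#include <stdio.h>
#include "spdk/env.h"

static int
notify(void *ctx, struct spdk_mem_map *map,
       enum spdk_mem_map_notify_action action, void *vaddr, size_t size)
{
        printf("%s %p %zu\n",
               action == SPDK_MEM_MAP_NOTIFY_REGISTER ? "register" : "unregister",
               vaddr, size);
        return 0;
}

static const struct spdk_mem_map_ops notify_ops = { .notify_cb = notify };

void
mem_callbacks_sketch(void)
{
        /* The map exists only to observe heap events; translations are unused. */
        struct spdk_mem_map *map = spdk_mem_map_alloc(0, &notify_ops, NULL);
        void *buf = spdk_dma_malloc(3 * 1024 * 1024, 0, NULL); /* 'malloc 3145728' */

        spdk_dma_free(buf);
        spdk_mem_map_free(&map);
}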
00:05:41.489 register 0x200000200000 2097152 00:05:41.489 malloc 3145728 00:05:41.489 register 0x200000400000 4194304 00:05:41.489 buf 0x200000500000 len 3145728 PASSED 00:05:41.489 malloc 64 00:05:41.489 buf 0x2000004fff40 len 64 PASSED 00:05:41.489 malloc 4194304 00:05:41.489 register 0x200000800000 6291456 00:05:41.489 buf 0x200000a00000 len 4194304 PASSED 00:05:41.489 free 0x200000500000 3145728 00:05:41.489 free 0x2000004fff40 64 00:05:41.489 unregister 0x200000400000 4194304 PASSED 00:05:41.489 free 0x200000a00000 4194304 00:05:41.489 unregister 0x200000800000 6291456 PASSED 00:05:41.489 malloc 8388608 00:05:41.489 register 0x200000400000 10485760 00:05:41.489 buf 0x200000600000 len 8388608 PASSED 00:05:41.489 free 0x200000600000 8388608 00:05:41.489 unregister 0x200000400000 10485760 PASSED 00:05:41.489 passed 00:05:41.489 00:05:41.489 Run Summary: Type Total Ran Passed Failed Inactive 00:05:41.489 suites 1 1 n/a 0 0 00:05:41.489 tests 1 1 1 0 0 00:05:41.489 asserts 15 15 15 0 n/a 00:05:41.489 00:05:41.489 Elapsed time = 0.005 seconds 00:05:41.489 00:05:41.489 real 0m0.061s 00:05:41.489 user 0m0.019s 00:05:41.489 sys 0m0.042s 00:05:41.489 16:14:09 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:41.489 16:14:09 -- common/autotest_common.sh@10 -- # set +x 00:05:41.489 ************************************ 00:05:41.489 END TEST env_mem_callbacks 00:05:41.489 ************************************ 00:05:41.489 00:05:41.489 real 0m6.401s 00:05:41.489 user 0m4.430s 00:05:41.489 sys 0m1.230s 00:05:41.489 16:14:09 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:41.489 16:14:09 -- common/autotest_common.sh@10 -- # set +x 00:05:41.489 ************************************ 00:05:41.489 END TEST env 00:05:41.489 ************************************ 00:05:41.489 16:14:09 -- spdk/autotest.sh@176 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:41.489 16:14:09 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:41.489 16:14:09 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:41.489 16:14:09 -- common/autotest_common.sh@10 -- # set +x 00:05:41.489 ************************************ 00:05:41.489 START TEST rpc 00:05:41.489 ************************************ 00:05:41.489 16:14:09 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:41.489 * Looking for test storage... 00:05:41.489 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:41.489 16:14:10 -- rpc/rpc.sh@65 -- # spdk_pid=2247067 00:05:41.489 16:14:10 -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:05:41.489 16:14:10 -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:41.489 16:14:10 -- rpc/rpc.sh@67 -- # waitforlisten 2247067 00:05:41.489 16:14:10 -- common/autotest_common.sh@819 -- # '[' -z 2247067 ']' 00:05:41.489 16:14:10 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:41.489 16:14:10 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:41.489 16:14:10 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:41.489 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
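From here on the suite talks to a live target: rpc.sh launches spdk_tgt with -e bdev (tpoint group mask 0x8, which is what trace_get_info reports later) and waitforlisten polls the default JSON-RPC socket /var/tmp/spdk.sock. spdk_tgt is the real binary; what follows is only a skeleton of the spdk_app_start() loop it is built on, with the RPC address spelled out (a sketch under those assumptions, not spdk_tgt's source):

#include "spdk/event.h"

static void
start_cb(void *ctx)
{
        /* Reactor is running; JSON-RPC methods such as bdev_malloc_create
         * and bdev_passthru_create are now served on the socket below. */
}

int
main(int argc, char **argv)
{
        struct spdk_app_opts opts = {};
        int rc;

        spdk_app_opts_init(&opts, sizeof(opts));
        opts.name = "spdk_tgt";
        opts.rpc_addr = "/var/tmp/spdk.sock"; /* what waitforlisten polls for */

        rc = spdk_app_start(&opts, start_cb, NULL);
        spdk_app_fini();
        return rc;
}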
00:05:41.489 16:14:10 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:41.489 16:14:10 -- common/autotest_common.sh@10 -- # set +x 00:05:41.489 [2024-07-20 16:14:10.081853] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:05:41.489 [2024-07-20 16:14:10.081945] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2247067 ] 00:05:41.489 EAL: No free 2048 kB hugepages reported on node 1 00:05:41.489 [2024-07-20 16:14:10.150408] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:41.489 [2024-07-20 16:14:10.187628] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:41.489 [2024-07-20 16:14:10.187752] app.c: 488:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:05:41.489 [2024-07-20 16:14:10.187764] app.c: 492:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 2247067' to capture a snapshot of events at runtime. 00:05:41.489 [2024-07-20 16:14:10.187774] app.c: 494:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid2247067 for offline analysis/debug. 00:05:41.489 [2024-07-20 16:14:10.187793] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:42.421 16:14:10 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:42.421 16:14:10 -- common/autotest_common.sh@852 -- # return 0 00:05:42.421 16:14:10 -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:42.421 16:14:10 -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:42.421 16:14:10 -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:05:42.421 16:14:10 -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:05:42.421 16:14:10 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:42.421 16:14:10 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:42.421 16:14:10 -- common/autotest_common.sh@10 -- # set +x 00:05:42.421 ************************************ 00:05:42.421 START TEST rpc_integrity 00:05:42.421 ************************************ 00:05:42.421 16:14:10 -- common/autotest_common.sh@1104 -- # rpc_integrity 00:05:42.421 16:14:10 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:42.421 16:14:10 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:42.421 16:14:10 -- common/autotest_common.sh@10 -- # set +x 00:05:42.421 16:14:10 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:42.421 16:14:10 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:42.421 16:14:10 -- rpc/rpc.sh@13 -- # jq length 00:05:42.421 16:14:10 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:42.421 16:14:10 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:42.421 16:14:10 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:42.421 16:14:10 -- common/autotest_common.sh@10 -- # set +x 00:05:42.421 16:14:10 -- 
common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:42.422 16:14:10 -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:05:42.422 16:14:10 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:42.422 16:14:10 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:42.422 16:14:10 -- common/autotest_common.sh@10 -- # set +x 00:05:42.422 16:14:10 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:42.422 16:14:10 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:42.422 { 00:05:42.422 "name": "Malloc0", 00:05:42.422 "aliases": [ 00:05:42.422 "81438322-758e-4e53-be34-863c76be5b41" 00:05:42.422 ], 00:05:42.422 "product_name": "Malloc disk", 00:05:42.422 "block_size": 512, 00:05:42.422 "num_blocks": 16384, 00:05:42.422 "uuid": "81438322-758e-4e53-be34-863c76be5b41", 00:05:42.422 "assigned_rate_limits": { 00:05:42.422 "rw_ios_per_sec": 0, 00:05:42.422 "rw_mbytes_per_sec": 0, 00:05:42.422 "r_mbytes_per_sec": 0, 00:05:42.422 "w_mbytes_per_sec": 0 00:05:42.422 }, 00:05:42.422 "claimed": false, 00:05:42.422 "zoned": false, 00:05:42.422 "supported_io_types": { 00:05:42.422 "read": true, 00:05:42.422 "write": true, 00:05:42.422 "unmap": true, 00:05:42.422 "write_zeroes": true, 00:05:42.422 "flush": true, 00:05:42.422 "reset": true, 00:05:42.422 "compare": false, 00:05:42.422 "compare_and_write": false, 00:05:42.422 "abort": true, 00:05:42.422 "nvme_admin": false, 00:05:42.422 "nvme_io": false 00:05:42.422 }, 00:05:42.422 "memory_domains": [ 00:05:42.422 { 00:05:42.422 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:42.422 "dma_device_type": 2 00:05:42.422 } 00:05:42.422 ], 00:05:42.422 "driver_specific": {} 00:05:42.422 } 00:05:42.422 ]' 00:05:42.422 16:14:10 -- rpc/rpc.sh@17 -- # jq length 00:05:42.422 16:14:11 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:42.422 16:14:11 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:05:42.422 16:14:11 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:42.422 16:14:11 -- common/autotest_common.sh@10 -- # set +x 00:05:42.422 [2024-07-20 16:14:11.015417] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:05:42.422 [2024-07-20 16:14:11.015460] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:42.422 [2024-07-20 16:14:11.015478] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x450e230 00:05:42.422 [2024-07-20 16:14:11.015488] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:42.422 [2024-07-20 16:14:11.016349] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:42.422 [2024-07-20 16:14:11.016372] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:42.422 Passthru0 00:05:42.422 16:14:11 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:42.422 16:14:11 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:42.422 16:14:11 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:42.422 16:14:11 -- common/autotest_common.sh@10 -- # set +x 00:05:42.422 16:14:11 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:42.422 16:14:11 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:42.422 { 00:05:42.422 "name": "Malloc0", 00:05:42.422 "aliases": [ 00:05:42.422 "81438322-758e-4e53-be34-863c76be5b41" 00:05:42.422 ], 00:05:42.422 "product_name": "Malloc disk", 00:05:42.422 "block_size": 512, 00:05:42.422 "num_blocks": 16384, 00:05:42.422 "uuid": "81438322-758e-4e53-be34-863c76be5b41", 00:05:42.422 "assigned_rate_limits": { 00:05:42.422 "rw_ios_per_sec": 0, 00:05:42.422 
"rw_mbytes_per_sec": 0, 00:05:42.422 "r_mbytes_per_sec": 0, 00:05:42.422 "w_mbytes_per_sec": 0 00:05:42.422 }, 00:05:42.422 "claimed": true, 00:05:42.422 "claim_type": "exclusive_write", 00:05:42.422 "zoned": false, 00:05:42.422 "supported_io_types": { 00:05:42.422 "read": true, 00:05:42.422 "write": true, 00:05:42.422 "unmap": true, 00:05:42.422 "write_zeroes": true, 00:05:42.422 "flush": true, 00:05:42.422 "reset": true, 00:05:42.422 "compare": false, 00:05:42.422 "compare_and_write": false, 00:05:42.422 "abort": true, 00:05:42.422 "nvme_admin": false, 00:05:42.422 "nvme_io": false 00:05:42.422 }, 00:05:42.422 "memory_domains": [ 00:05:42.422 { 00:05:42.422 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:42.422 "dma_device_type": 2 00:05:42.422 } 00:05:42.422 ], 00:05:42.422 "driver_specific": {} 00:05:42.422 }, 00:05:42.422 { 00:05:42.422 "name": "Passthru0", 00:05:42.422 "aliases": [ 00:05:42.422 "4da29981-6fe8-5ff6-b910-540fe3ea02c1" 00:05:42.422 ], 00:05:42.422 "product_name": "passthru", 00:05:42.422 "block_size": 512, 00:05:42.422 "num_blocks": 16384, 00:05:42.422 "uuid": "4da29981-6fe8-5ff6-b910-540fe3ea02c1", 00:05:42.422 "assigned_rate_limits": { 00:05:42.422 "rw_ios_per_sec": 0, 00:05:42.422 "rw_mbytes_per_sec": 0, 00:05:42.422 "r_mbytes_per_sec": 0, 00:05:42.422 "w_mbytes_per_sec": 0 00:05:42.422 }, 00:05:42.422 "claimed": false, 00:05:42.422 "zoned": false, 00:05:42.422 "supported_io_types": { 00:05:42.422 "read": true, 00:05:42.422 "write": true, 00:05:42.422 "unmap": true, 00:05:42.422 "write_zeroes": true, 00:05:42.422 "flush": true, 00:05:42.422 "reset": true, 00:05:42.422 "compare": false, 00:05:42.422 "compare_and_write": false, 00:05:42.422 "abort": true, 00:05:42.422 "nvme_admin": false, 00:05:42.422 "nvme_io": false 00:05:42.422 }, 00:05:42.422 "memory_domains": [ 00:05:42.422 { 00:05:42.422 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:42.422 "dma_device_type": 2 00:05:42.422 } 00:05:42.422 ], 00:05:42.422 "driver_specific": { 00:05:42.422 "passthru": { 00:05:42.422 "name": "Passthru0", 00:05:42.422 "base_bdev_name": "Malloc0" 00:05:42.422 } 00:05:42.422 } 00:05:42.422 } 00:05:42.422 ]' 00:05:42.422 16:14:11 -- rpc/rpc.sh@21 -- # jq length 00:05:42.422 16:14:11 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:42.422 16:14:11 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:42.422 16:14:11 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:42.422 16:14:11 -- common/autotest_common.sh@10 -- # set +x 00:05:42.422 16:14:11 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:42.422 16:14:11 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:05:42.422 16:14:11 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:42.422 16:14:11 -- common/autotest_common.sh@10 -- # set +x 00:05:42.422 16:14:11 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:42.422 16:14:11 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:42.422 16:14:11 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:42.422 16:14:11 -- common/autotest_common.sh@10 -- # set +x 00:05:42.422 16:14:11 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:42.422 16:14:11 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:42.422 16:14:11 -- rpc/rpc.sh@26 -- # jq length 00:05:42.422 16:14:11 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:42.422 00:05:42.422 real 0m0.274s 00:05:42.422 user 0m0.172s 00:05:42.422 sys 0m0.042s 00:05:42.422 16:14:11 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:42.422 16:14:11 -- common/autotest_common.sh@10 -- # set +x 00:05:42.422 
************************************ 00:05:42.422 END TEST rpc_integrity 00:05:42.422 ************************************ 00:05:42.422 16:14:11 -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:05:42.422 16:14:11 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:42.422 16:14:11 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:42.422 16:14:11 -- common/autotest_common.sh@10 -- # set +x 00:05:42.422 ************************************ 00:05:42.422 START TEST rpc_plugins 00:05:42.422 ************************************ 00:05:42.422 16:14:11 -- common/autotest_common.sh@1104 -- # rpc_plugins 00:05:42.422 16:14:11 -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:05:42.422 16:14:11 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:42.422 16:14:11 -- common/autotest_common.sh@10 -- # set +x 00:05:42.422 16:14:11 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:42.422 16:14:11 -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:05:42.422 16:14:11 -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:05:42.422 16:14:11 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:42.422 16:14:11 -- common/autotest_common.sh@10 -- # set +x 00:05:42.679 16:14:11 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:42.679 16:14:11 -- rpc/rpc.sh@31 -- # bdevs='[ 00:05:42.679 { 00:05:42.679 "name": "Malloc1", 00:05:42.679 "aliases": [ 00:05:42.679 "87b21c3d-6594-4f33-9335-f6e39ab314f7" 00:05:42.679 ], 00:05:42.679 "product_name": "Malloc disk", 00:05:42.679 "block_size": 4096, 00:05:42.679 "num_blocks": 256, 00:05:42.679 "uuid": "87b21c3d-6594-4f33-9335-f6e39ab314f7", 00:05:42.679 "assigned_rate_limits": { 00:05:42.679 "rw_ios_per_sec": 0, 00:05:42.679 "rw_mbytes_per_sec": 0, 00:05:42.679 "r_mbytes_per_sec": 0, 00:05:42.679 "w_mbytes_per_sec": 0 00:05:42.679 }, 00:05:42.679 "claimed": false, 00:05:42.679 "zoned": false, 00:05:42.679 "supported_io_types": { 00:05:42.679 "read": true, 00:05:42.679 "write": true, 00:05:42.679 "unmap": true, 00:05:42.679 "write_zeroes": true, 00:05:42.679 "flush": true, 00:05:42.679 "reset": true, 00:05:42.679 "compare": false, 00:05:42.679 "compare_and_write": false, 00:05:42.679 "abort": true, 00:05:42.679 "nvme_admin": false, 00:05:42.679 "nvme_io": false 00:05:42.679 }, 00:05:42.679 "memory_domains": [ 00:05:42.679 { 00:05:42.679 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:42.679 "dma_device_type": 2 00:05:42.679 } 00:05:42.679 ], 00:05:42.679 "driver_specific": {} 00:05:42.679 } 00:05:42.679 ]' 00:05:42.679 16:14:11 -- rpc/rpc.sh@32 -- # jq length 00:05:42.679 16:14:11 -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:05:42.679 16:14:11 -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:05:42.679 16:14:11 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:42.679 16:14:11 -- common/autotest_common.sh@10 -- # set +x 00:05:42.679 16:14:11 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:42.679 16:14:11 -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:05:42.679 16:14:11 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:42.679 16:14:11 -- common/autotest_common.sh@10 -- # set +x 00:05:42.679 16:14:11 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:42.679 16:14:11 -- rpc/rpc.sh@35 -- # bdevs='[]' 00:05:42.679 16:14:11 -- rpc/rpc.sh@36 -- # jq length 00:05:42.679 16:14:11 -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:05:42.679 00:05:42.679 real 0m0.138s 00:05:42.679 user 0m0.088s 00:05:42.679 sys 0m0.020s 00:05:42.679 16:14:11 -- common/autotest_common.sh@1105 -- # xtrace_disable 
00:05:42.679 16:14:11 -- common/autotest_common.sh@10 -- # set +x 00:05:42.679 ************************************ 00:05:42.679 END TEST rpc_plugins 00:05:42.679 ************************************ 00:05:42.679 16:14:11 -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:05:42.680 16:14:11 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:42.680 16:14:11 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:42.680 16:14:11 -- common/autotest_common.sh@10 -- # set +x 00:05:42.680 ************************************ 00:05:42.680 START TEST rpc_trace_cmd_test 00:05:42.680 ************************************ 00:05:42.680 16:14:11 -- common/autotest_common.sh@1104 -- # rpc_trace_cmd_test 00:05:42.680 16:14:11 -- rpc/rpc.sh@40 -- # local info 00:05:42.680 16:14:11 -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:05:42.680 16:14:11 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:42.680 16:14:11 -- common/autotest_common.sh@10 -- # set +x 00:05:42.680 16:14:11 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:42.680 16:14:11 -- rpc/rpc.sh@42 -- # info='{ 00:05:42.680 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid2247067", 00:05:42.680 "tpoint_group_mask": "0x8", 00:05:42.680 "iscsi_conn": { 00:05:42.680 "mask": "0x2", 00:05:42.680 "tpoint_mask": "0x0" 00:05:42.680 }, 00:05:42.680 "scsi": { 00:05:42.680 "mask": "0x4", 00:05:42.680 "tpoint_mask": "0x0" 00:05:42.680 }, 00:05:42.680 "bdev": { 00:05:42.680 "mask": "0x8", 00:05:42.680 "tpoint_mask": "0xffffffffffffffff" 00:05:42.680 }, 00:05:42.680 "nvmf_rdma": { 00:05:42.680 "mask": "0x10", 00:05:42.680 "tpoint_mask": "0x0" 00:05:42.680 }, 00:05:42.680 "nvmf_tcp": { 00:05:42.680 "mask": "0x20", 00:05:42.680 "tpoint_mask": "0x0" 00:05:42.680 }, 00:05:42.680 "ftl": { 00:05:42.680 "mask": "0x40", 00:05:42.680 "tpoint_mask": "0x0" 00:05:42.680 }, 00:05:42.680 "blobfs": { 00:05:42.680 "mask": "0x80", 00:05:42.680 "tpoint_mask": "0x0" 00:05:42.680 }, 00:05:42.680 "dsa": { 00:05:42.680 "mask": "0x200", 00:05:42.680 "tpoint_mask": "0x0" 00:05:42.680 }, 00:05:42.680 "thread": { 00:05:42.680 "mask": "0x400", 00:05:42.680 "tpoint_mask": "0x0" 00:05:42.680 }, 00:05:42.680 "nvme_pcie": { 00:05:42.680 "mask": "0x800", 00:05:42.680 "tpoint_mask": "0x0" 00:05:42.680 }, 00:05:42.680 "iaa": { 00:05:42.680 "mask": "0x1000", 00:05:42.680 "tpoint_mask": "0x0" 00:05:42.680 }, 00:05:42.680 "nvme_tcp": { 00:05:42.680 "mask": "0x2000", 00:05:42.680 "tpoint_mask": "0x0" 00:05:42.680 }, 00:05:42.680 "bdev_nvme": { 00:05:42.680 "mask": "0x4000", 00:05:42.680 "tpoint_mask": "0x0" 00:05:42.680 } 00:05:42.680 }' 00:05:42.680 16:14:11 -- rpc/rpc.sh@43 -- # jq length 00:05:42.680 16:14:11 -- rpc/rpc.sh@43 -- # '[' 15 -gt 2 ']' 00:05:42.680 16:14:11 -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:05:42.936 16:14:11 -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:05:42.936 16:14:11 -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:05:42.936 16:14:11 -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:05:42.936 16:14:11 -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:05:42.936 16:14:11 -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:05:42.936 16:14:11 -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:05:42.936 16:14:11 -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:05:42.936 00:05:42.936 real 0m0.201s 00:05:42.936 user 0m0.160s 00:05:42.936 sys 0m0.033s 00:05:42.936 16:14:11 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:42.936 16:14:11 -- common/autotest_common.sh@10 -- # set +x 00:05:42.936 
************************************ 00:05:42.936 END TEST rpc_trace_cmd_test 00:05:42.936 ************************************ 00:05:42.936 16:14:11 -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:05:42.936 16:14:11 -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:05:42.936 16:14:11 -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:05:42.936 16:14:11 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:42.936 16:14:11 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:42.936 16:14:11 -- common/autotest_common.sh@10 -- # set +x 00:05:42.936 ************************************ 00:05:42.936 START TEST rpc_daemon_integrity 00:05:42.936 ************************************ 00:05:42.936 16:14:11 -- common/autotest_common.sh@1104 -- # rpc_integrity 00:05:42.936 16:14:11 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:42.936 16:14:11 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:42.936 16:14:11 -- common/autotest_common.sh@10 -- # set +x 00:05:42.936 16:14:11 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:42.936 16:14:11 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:42.936 16:14:11 -- rpc/rpc.sh@13 -- # jq length 00:05:42.936 16:14:11 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:42.936 16:14:11 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:42.936 16:14:11 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:42.936 16:14:11 -- common/autotest_common.sh@10 -- # set +x 00:05:42.937 16:14:11 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:42.937 16:14:11 -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:05:42.937 16:14:11 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:42.937 16:14:11 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:42.937 16:14:11 -- common/autotest_common.sh@10 -- # set +x 00:05:42.937 16:14:11 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:42.937 16:14:11 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:42.937 { 00:05:42.937 "name": "Malloc2", 00:05:42.937 "aliases": [ 00:05:42.937 "bbe5c1be-18f7-4413-866c-4cc7bf752ebb" 00:05:42.937 ], 00:05:42.937 "product_name": "Malloc disk", 00:05:42.937 "block_size": 512, 00:05:42.937 "num_blocks": 16384, 00:05:42.937 "uuid": "bbe5c1be-18f7-4413-866c-4cc7bf752ebb", 00:05:42.937 "assigned_rate_limits": { 00:05:42.937 "rw_ios_per_sec": 0, 00:05:42.937 "rw_mbytes_per_sec": 0, 00:05:42.937 "r_mbytes_per_sec": 0, 00:05:42.937 "w_mbytes_per_sec": 0 00:05:42.937 }, 00:05:42.937 "claimed": false, 00:05:42.937 "zoned": false, 00:05:42.937 "supported_io_types": { 00:05:42.937 "read": true, 00:05:42.937 "write": true, 00:05:42.937 "unmap": true, 00:05:42.937 "write_zeroes": true, 00:05:42.937 "flush": true, 00:05:42.937 "reset": true, 00:05:42.937 "compare": false, 00:05:42.937 "compare_and_write": false, 00:05:42.937 "abort": true, 00:05:42.937 "nvme_admin": false, 00:05:42.937 "nvme_io": false 00:05:42.937 }, 00:05:42.937 "memory_domains": [ 00:05:42.937 { 00:05:42.937 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:42.937 "dma_device_type": 2 00:05:42.937 } 00:05:42.937 ], 00:05:42.937 "driver_specific": {} 00:05:42.937 } 00:05:42.937 ]' 00:05:42.937 16:14:11 -- rpc/rpc.sh@17 -- # jq length 00:05:43.193 16:14:11 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:43.193 16:14:11 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:05:43.193 16:14:11 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:43.193 16:14:11 -- common/autotest_common.sh@10 -- # set +x 00:05:43.193 [2024-07-20 16:14:11.781404] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on 
Malloc2 00:05:43.193 [2024-07-20 16:14:11.781447] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:43.193 [2024-07-20 16:14:11.781464] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x43768d0 00:05:43.193 [2024-07-20 16:14:11.781474] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:43.193 [2024-07-20 16:14:11.782217] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:43.193 [2024-07-20 16:14:11.782239] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:43.193 Passthru0 00:05:43.193 16:14:11 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:43.193 16:14:11 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:43.193 16:14:11 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:43.193 16:14:11 -- common/autotest_common.sh@10 -- # set +x 00:05:43.193 16:14:11 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:43.193 16:14:11 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:43.193 { 00:05:43.193 "name": "Malloc2", 00:05:43.193 "aliases": [ 00:05:43.193 "bbe5c1be-18f7-4413-866c-4cc7bf752ebb" 00:05:43.193 ], 00:05:43.193 "product_name": "Malloc disk", 00:05:43.193 "block_size": 512, 00:05:43.193 "num_blocks": 16384, 00:05:43.193 "uuid": "bbe5c1be-18f7-4413-866c-4cc7bf752ebb", 00:05:43.193 "assigned_rate_limits": { 00:05:43.193 "rw_ios_per_sec": 0, 00:05:43.193 "rw_mbytes_per_sec": 0, 00:05:43.193 "r_mbytes_per_sec": 0, 00:05:43.193 "w_mbytes_per_sec": 0 00:05:43.193 }, 00:05:43.193 "claimed": true, 00:05:43.193 "claim_type": "exclusive_write", 00:05:43.193 "zoned": false, 00:05:43.193 "supported_io_types": { 00:05:43.193 "read": true, 00:05:43.193 "write": true, 00:05:43.193 "unmap": true, 00:05:43.193 "write_zeroes": true, 00:05:43.193 "flush": true, 00:05:43.193 "reset": true, 00:05:43.193 "compare": false, 00:05:43.193 "compare_and_write": false, 00:05:43.193 "abort": true, 00:05:43.193 "nvme_admin": false, 00:05:43.193 "nvme_io": false 00:05:43.193 }, 00:05:43.193 "memory_domains": [ 00:05:43.193 { 00:05:43.193 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:43.193 "dma_device_type": 2 00:05:43.193 } 00:05:43.193 ], 00:05:43.193 "driver_specific": {} 00:05:43.193 }, 00:05:43.193 { 00:05:43.193 "name": "Passthru0", 00:05:43.193 "aliases": [ 00:05:43.193 "65e10044-b9d1-555e-9a8d-244caf89b5b0" 00:05:43.193 ], 00:05:43.193 "product_name": "passthru", 00:05:43.193 "block_size": 512, 00:05:43.193 "num_blocks": 16384, 00:05:43.193 "uuid": "65e10044-b9d1-555e-9a8d-244caf89b5b0", 00:05:43.193 "assigned_rate_limits": { 00:05:43.193 "rw_ios_per_sec": 0, 00:05:43.193 "rw_mbytes_per_sec": 0, 00:05:43.193 "r_mbytes_per_sec": 0, 00:05:43.193 "w_mbytes_per_sec": 0 00:05:43.193 }, 00:05:43.193 "claimed": false, 00:05:43.193 "zoned": false, 00:05:43.193 "supported_io_types": { 00:05:43.193 "read": true, 00:05:43.193 "write": true, 00:05:43.193 "unmap": true, 00:05:43.193 "write_zeroes": true, 00:05:43.193 "flush": true, 00:05:43.193 "reset": true, 00:05:43.193 "compare": false, 00:05:43.193 "compare_and_write": false, 00:05:43.193 "abort": true, 00:05:43.193 "nvme_admin": false, 00:05:43.193 "nvme_io": false 00:05:43.193 }, 00:05:43.193 "memory_domains": [ 00:05:43.193 { 00:05:43.193 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:43.193 "dma_device_type": 2 00:05:43.193 } 00:05:43.193 ], 00:05:43.193 "driver_specific": { 00:05:43.193 "passthru": { 00:05:43.193 "name": "Passthru0", 00:05:43.193 "base_bdev_name": "Malloc2" 00:05:43.193 } 
00:05:43.193 } 00:05:43.193 } 00:05:43.193 ]' 00:05:43.193 16:14:11 -- rpc/rpc.sh@21 -- # jq length 00:05:43.193 16:14:11 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:43.193 16:14:11 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:43.193 16:14:11 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:43.193 16:14:11 -- common/autotest_common.sh@10 -- # set +x 00:05:43.193 16:14:11 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:43.193 16:14:11 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:05:43.193 16:14:11 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:43.193 16:14:11 -- common/autotest_common.sh@10 -- # set +x 00:05:43.193 16:14:11 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:43.193 16:14:11 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:43.193 16:14:11 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:43.193 16:14:11 -- common/autotest_common.sh@10 -- # set +x 00:05:43.193 16:14:11 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:43.193 16:14:11 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:43.193 16:14:11 -- rpc/rpc.sh@26 -- # jq length 00:05:43.193 16:14:11 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:43.193 00:05:43.193 real 0m0.291s 00:05:43.193 user 0m0.187s 00:05:43.193 sys 0m0.040s 00:05:43.193 16:14:11 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:43.193 16:14:11 -- common/autotest_common.sh@10 -- # set +x 00:05:43.193 ************************************ 00:05:43.193 END TEST rpc_daemon_integrity 00:05:43.193 ************************************ 00:05:43.193 16:14:11 -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:05:43.193 16:14:11 -- rpc/rpc.sh@84 -- # killprocess 2247067 00:05:43.193 16:14:11 -- common/autotest_common.sh@926 -- # '[' -z 2247067 ']' 00:05:43.193 16:14:11 -- common/autotest_common.sh@930 -- # kill -0 2247067 00:05:43.193 16:14:11 -- common/autotest_common.sh@931 -- # uname 00:05:43.193 16:14:11 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:43.193 16:14:11 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2247067 00:05:43.450 16:14:12 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:43.450 16:14:12 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:43.450 16:14:12 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2247067' 00:05:43.450 killing process with pid 2247067 00:05:43.450 16:14:12 -- common/autotest_common.sh@945 -- # kill 2247067 00:05:43.450 16:14:12 -- common/autotest_common.sh@950 -- # wait 2247067 00:05:43.708 00:05:43.708 real 0m2.357s 00:05:43.708 user 0m2.966s 00:05:43.708 sys 0m0.704s 00:05:43.708 16:14:12 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:43.708 16:14:12 -- common/autotest_common.sh@10 -- # set +x 00:05:43.708 ************************************ 00:05:43.708 END TEST rpc 00:05:43.708 ************************************ 00:05:43.708 16:14:12 -- spdk/autotest.sh@177 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:43.708 16:14:12 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:43.708 16:14:12 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:43.708 16:14:12 -- common/autotest_common.sh@10 -- # set +x 00:05:43.708 ************************************ 00:05:43.708 START TEST rpc_client 00:05:43.708 ************************************ 00:05:43.708 16:14:12 -- common/autotest_common.sh@1104 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:43.708 * Looking for test storage... 00:05:43.708 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:05:43.708 16:14:12 -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:05:43.708 OK 00:05:43.708 16:14:12 -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:05:43.708 00:05:43.708 real 0m0.123s 00:05:43.708 user 0m0.057s 00:05:43.708 sys 0m0.076s 00:05:43.708 16:14:12 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:43.708 16:14:12 -- common/autotest_common.sh@10 -- # set +x 00:05:43.708 ************************************ 00:05:43.708 END TEST rpc_client 00:05:43.708 ************************************ 00:05:43.966 16:14:12 -- spdk/autotest.sh@178 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:05:43.966 16:14:12 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:43.966 16:14:12 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:43.966 16:14:12 -- common/autotest_common.sh@10 -- # set +x 00:05:43.966 ************************************ 00:05:43.966 START TEST json_config 00:05:43.966 ************************************ 00:05:43.966 16:14:12 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:05:43.966 16:14:12 -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:05:43.966 16:14:12 -- nvmf/common.sh@7 -- # uname -s 00:05:43.966 16:14:12 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:43.966 16:14:12 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:43.966 16:14:12 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:43.966 16:14:12 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:43.966 16:14:12 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:43.966 16:14:12 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:43.966 16:14:12 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:43.966 16:14:12 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:43.966 16:14:12 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:43.966 16:14:12 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:43.966 16:14:12 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:05:43.966 16:14:12 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:05:43.966 16:14:12 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:43.966 16:14:12 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:43.966 16:14:12 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:43.966 16:14:12 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:05:43.966 16:14:12 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:43.966 16:14:12 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:43.966 16:14:12 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:43.966 16:14:12 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:43.966 16:14:12 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:43.966 16:14:12 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:43.966 16:14:12 -- paths/export.sh@5 -- # export PATH 00:05:43.966 16:14:12 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:43.966 16:14:12 -- nvmf/common.sh@46 -- # : 0 00:05:43.966 16:14:12 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:05:43.966 16:14:12 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:05:43.966 16:14:12 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:05:43.966 16:14:12 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:43.966 16:14:12 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:43.966 16:14:12 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:05:43.966 16:14:12 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:05:43.966 16:14:12 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:05:43.966 16:14:12 -- json_config/json_config.sh@10 -- # [[ 0 -eq 1 ]] 00:05:43.966 16:14:12 -- json_config/json_config.sh@14 -- # [[ 0 -ne 1 ]] 00:05:43.966 16:14:12 -- json_config/json_config.sh@14 -- # [[ 0 -eq 1 ]] 00:05:43.966 16:14:12 -- json_config/json_config.sh@25 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:05:43.966 16:14:12 -- json_config/json_config.sh@26 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:05:43.966 WARNING: No tests are enabled so not running JSON configuration tests 00:05:43.966 16:14:12 -- json_config/json_config.sh@27 -- # exit 0 00:05:43.966 00:05:43.966 real 0m0.103s 00:05:43.966 user 0m0.051s 00:05:43.966 sys 0m0.054s 00:05:43.966 16:14:12 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:43.966 16:14:12 -- common/autotest_common.sh@10 -- # set +x 00:05:43.966 ************************************ 00:05:43.966 END TEST json_config 00:05:43.966 ************************************ 00:05:43.967 16:14:12 -- spdk/autotest.sh@179 -- # run_test json_config_extra_key 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:43.967 16:14:12 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:43.967 16:14:12 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:43.967 16:14:12 -- common/autotest_common.sh@10 -- # set +x 00:05:43.967 ************************************ 00:05:43.967 START TEST json_config_extra_key 00:05:43.967 ************************************ 00:05:43.967 16:14:12 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:43.967 16:14:12 -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:05:43.967 16:14:12 -- nvmf/common.sh@7 -- # uname -s 00:05:43.967 16:14:12 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:43.967 16:14:12 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:43.967 16:14:12 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:43.967 16:14:12 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:43.967 16:14:12 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:43.967 16:14:12 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:43.967 16:14:12 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:43.967 16:14:12 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:43.967 16:14:12 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:43.967 16:14:12 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:44.223 16:14:12 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:05:44.224 16:14:12 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:05:44.224 16:14:12 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:44.224 16:14:12 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:44.224 16:14:12 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:44.224 16:14:12 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:05:44.224 16:14:12 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:44.224 16:14:12 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:44.224 16:14:12 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:44.224 16:14:12 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:44.224 16:14:12 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:44.224 16:14:12 -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:44.224 16:14:12 -- paths/export.sh@5 -- # export PATH 00:05:44.224 16:14:12 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:44.224 16:14:12 -- nvmf/common.sh@46 -- # : 0 00:05:44.224 16:14:12 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:05:44.224 16:14:12 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:05:44.224 16:14:12 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:05:44.224 16:14:12 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:44.224 16:14:12 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:44.224 16:14:12 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:05:44.224 16:14:12 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:05:44.224 16:14:12 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:05:44.224 16:14:12 -- json_config/json_config_extra_key.sh@16 -- # app_pid=(['target']='') 00:05:44.224 16:14:12 -- json_config/json_config_extra_key.sh@16 -- # declare -A app_pid 00:05:44.224 16:14:12 -- json_config/json_config_extra_key.sh@17 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:05:44.224 16:14:12 -- json_config/json_config_extra_key.sh@17 -- # declare -A app_socket 00:05:44.224 16:14:12 -- json_config/json_config_extra_key.sh@18 -- # app_params=(['target']='-m 0x1 -s 1024') 00:05:44.224 16:14:12 -- json_config/json_config_extra_key.sh@18 -- # declare -A app_params 00:05:44.224 16:14:12 -- json_config/json_config_extra_key.sh@19 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:05:44.224 16:14:12 -- json_config/json_config_extra_key.sh@19 -- # declare -A configs_path 00:05:44.224 16:14:12 -- json_config/json_config_extra_key.sh@74 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:05:44.224 16:14:12 -- json_config/json_config_extra_key.sh@76 -- # echo 'INFO: launching applications...' 00:05:44.224 INFO: launching applications... 00:05:44.224 16:14:12 -- json_config/json_config_extra_key.sh@77 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:05:44.224 16:14:12 -- json_config/json_config_extra_key.sh@24 -- # local app=target 00:05:44.224 16:14:12 -- json_config/json_config_extra_key.sh@25 -- # shift 00:05:44.224 16:14:12 -- json_config/json_config_extra_key.sh@27 -- # [[ -n 22 ]] 00:05:44.224 16:14:12 -- json_config/json_config_extra_key.sh@28 -- # [[ -z '' ]] 00:05:44.224 16:14:12 -- json_config/json_config_extra_key.sh@31 -- # app_pid[$app]=2247767 00:05:44.224 16:14:12 -- json_config/json_config_extra_key.sh@33 -- # echo 'Waiting for target to run...' 00:05:44.224 Waiting for target to run... 
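The trace above shows json_config_extra_key keeping per-app state in bash associative arrays before launching the target. A minimal sketch of that bookkeeping pattern, with values lifted from the trace (running it standalone like this is hypothetical; the test wraps it in helper functions):

# Track pid, RPC socket, and launch parameters per app, as the test script does.
declare -A app_pid=([target]='')
declare -A app_socket=([target]='/var/tmp/spdk_tgt.sock')
declare -A app_params=([target]='-m 0x1 -s 1024')

app=target
echo "app=$app socket=${app_socket[$app]} params=${app_params[$app]}"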
00:05:44.224 16:14:12 -- json_config/json_config_extra_key.sh@34 -- # waitforlisten 2247767 /var/tmp/spdk_tgt.sock 00:05:44.224 16:14:12 -- common/autotest_common.sh@819 -- # '[' -z 2247767 ']' 00:05:44.224 16:14:12 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:05:44.224 16:14:12 -- json_config/json_config_extra_key.sh@30 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:05:44.224 16:14:12 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:44.224 16:14:12 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:05:44.224 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:05:44.224 16:14:12 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:44.224 16:14:12 -- common/autotest_common.sh@10 -- # set +x 00:05:44.224 [2024-07-20 16:14:12.817004] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:05:44.224 [2024-07-20 16:14:12.817098] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2247767 ] 00:05:44.224 EAL: No free 2048 kB hugepages reported on node 1 00:05:44.481 [2024-07-20 16:14:13.099869] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:44.481 [2024-07-20 16:14:13.119644] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:44.481 [2024-07-20 16:14:13.119754] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:45.046 16:14:13 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:45.046 16:14:13 -- common/autotest_common.sh@852 -- # return 0 00:05:45.046 16:14:13 -- json_config/json_config_extra_key.sh@35 -- # echo '' 00:05:45.046 00:05:45.046 16:14:13 -- json_config/json_config_extra_key.sh@79 -- # echo 'INFO: shutting down applications...' 00:05:45.046 INFO: shutting down applications... 
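The launch captured above can be reproduced by hand; a minimal sketch with repo-relative paths substituted for the Jenkins workspace paths in the trace (the sleep and the rpc_get_methods probe stand in for the test's waitforlisten helper):

# Start spdk_tgt on a private RPC socket and apply a JSON config at boot.
./build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock \
    --json ./test/json_config/extra_key.json &
sleep 1   # the test instead polls via its waitforlisten helper
# One way to confirm the RPC socket is up:
./scripts/rpc.py -s /var/tmp/spdk_tgt.sock rpc_get_methods > /dev/null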
00:05:45.046 16:14:13 -- json_config/json_config_extra_key.sh@80 -- # json_config_test_shutdown_app target 00:05:45.046 16:14:13 -- json_config/json_config_extra_key.sh@40 -- # local app=target 00:05:45.046 16:14:13 -- json_config/json_config_extra_key.sh@43 -- # [[ -n 22 ]] 00:05:45.046 16:14:13 -- json_config/json_config_extra_key.sh@44 -- # [[ -n 2247767 ]] 00:05:45.046 16:14:13 -- json_config/json_config_extra_key.sh@47 -- # kill -SIGINT 2247767 00:05:45.046 16:14:13 -- json_config/json_config_extra_key.sh@49 -- # (( i = 0 )) 00:05:45.046 16:14:13 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:05:45.046 16:14:13 -- json_config/json_config_extra_key.sh@50 -- # kill -0 2247767 00:05:45.046 16:14:13 -- json_config/json_config_extra_key.sh@54 -- # sleep 0.5 00:05:45.612 16:14:14 -- json_config/json_config_extra_key.sh@49 -- # (( i++ )) 00:05:45.612 16:14:14 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:05:45.612 16:14:14 -- json_config/json_config_extra_key.sh@50 -- # kill -0 2247767 00:05:45.612 16:14:14 -- json_config/json_config_extra_key.sh@51 -- # app_pid[$app]= 00:05:45.612 16:14:14 -- json_config/json_config_extra_key.sh@52 -- # break 00:05:45.612 16:14:14 -- json_config/json_config_extra_key.sh@57 -- # [[ -n '' ]] 00:05:45.612 16:14:14 -- json_config/json_config_extra_key.sh@62 -- # echo 'SPDK target shutdown done' 00:05:45.612 SPDK target shutdown done 00:05:45.612 16:14:14 -- json_config/json_config_extra_key.sh@82 -- # echo Success 00:05:45.612 Success 00:05:45.612 00:05:45.612 real 0m1.447s 00:05:45.612 user 0m1.174s 00:05:45.612 sys 0m0.400s 00:05:45.612 16:14:14 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:45.612 16:14:14 -- common/autotest_common.sh@10 -- # set +x 00:05:45.612 ************************************ 00:05:45.612 END TEST json_config_extra_key 00:05:45.612 ************************************ 00:05:45.612 16:14:14 -- spdk/autotest.sh@180 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:45.612 16:14:14 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:45.612 16:14:14 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:45.612 16:14:14 -- common/autotest_common.sh@10 -- # set +x 00:05:45.612 ************************************ 00:05:45.612 START TEST alias_rpc 00:05:45.612 ************************************ 00:05:45.612 16:14:14 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:45.612 * Looking for test storage... 00:05:45.612 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:05:45.612 16:14:14 -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:05:45.612 16:14:14 -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:45.612 16:14:14 -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=2248051 00:05:45.612 16:14:14 -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 2248051 00:05:45.612 16:14:14 -- common/autotest_common.sh@819 -- # '[' -z 2248051 ']' 00:05:45.612 16:14:14 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:45.612 16:14:14 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:45.612 16:14:14 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:05:45.612 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:45.612 16:14:14 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:45.612 16:14:14 -- common/autotest_common.sh@10 -- # set +x 00:05:45.612 [2024-07-20 16:14:14.302775] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:05:45.612 [2024-07-20 16:14:14.302844] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2248051 ] 00:05:45.612 EAL: No free 2048 kB hugepages reported on node 1 00:05:45.612 [2024-07-20 16:14:14.368646] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:45.612 [2024-07-20 16:14:14.407650] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:45.612 [2024-07-20 16:14:14.407760] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:46.543 16:14:15 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:46.543 16:14:15 -- common/autotest_common.sh@852 -- # return 0 00:05:46.543 16:14:15 -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:05:46.543 16:14:15 -- alias_rpc/alias_rpc.sh@19 -- # killprocess 2248051 00:05:46.543 16:14:15 -- common/autotest_common.sh@926 -- # '[' -z 2248051 ']' 00:05:46.543 16:14:15 -- common/autotest_common.sh@930 -- # kill -0 2248051 00:05:46.543 16:14:15 -- common/autotest_common.sh@931 -- # uname 00:05:46.543 16:14:15 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:46.543 16:14:15 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2248051 00:05:46.800 16:14:15 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:46.800 16:14:15 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:46.800 16:14:15 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2248051' 00:05:46.800 killing process with pid 2248051 00:05:46.800 16:14:15 -- common/autotest_common.sh@945 -- # kill 2248051 00:05:46.800 16:14:15 -- common/autotest_common.sh@950 -- # wait 2248051 00:05:47.056 00:05:47.056 real 0m1.480s 00:05:47.056 user 0m1.569s 00:05:47.056 sys 0m0.444s 00:05:47.056 16:14:15 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:47.056 16:14:15 -- common/autotest_common.sh@10 -- # set +x 00:05:47.056 ************************************ 00:05:47.056 END TEST alias_rpc 00:05:47.056 ************************************ 00:05:47.056 16:14:15 -- spdk/autotest.sh@182 -- # [[ 0 -eq 0 ]] 00:05:47.056 16:14:15 -- spdk/autotest.sh@183 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:47.056 16:14:15 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:47.056 16:14:15 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:47.056 16:14:15 -- common/autotest_common.sh@10 -- # set +x 00:05:47.056 ************************************ 00:05:47.056 START TEST spdkcli_tcp 00:05:47.056 ************************************ 00:05:47.056 16:14:15 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:47.056 * Looking for test storage... 
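The alias_rpc run above drives everything through a single RPC: load_config with -i, which replays a saved JSON configuration while resolving deprecated method aliases. A rough standalone equivalent (the config file name here is illustrative; load_config reads the JSON on stdin):

# Replay a previously saved configuration through the alias layer (-i).
./build/bin/spdk_tgt &
./scripts/rpc.py load_config -i < saved_config.json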
00:05:47.056 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:05:47.056 16:14:15 -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:05:47.056 16:14:15 -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:05:47.056 16:14:15 -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:05:47.056 16:14:15 -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:05:47.056 16:14:15 -- spdkcli/tcp.sh@19 -- # PORT=9998 00:05:47.056 16:14:15 -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:05:47.056 16:14:15 -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:05:47.056 16:14:15 -- common/autotest_common.sh@712 -- # xtrace_disable 00:05:47.056 16:14:15 -- common/autotest_common.sh@10 -- # set +x 00:05:47.056 16:14:15 -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=2248362 00:05:47.056 16:14:15 -- spdkcli/tcp.sh@27 -- # waitforlisten 2248362 00:05:47.056 16:14:15 -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:05:47.056 16:14:15 -- common/autotest_common.sh@819 -- # '[' -z 2248362 ']' 00:05:47.056 16:14:15 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:47.056 16:14:15 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:47.056 16:14:15 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:47.056 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:47.056 16:14:15 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:47.056 16:14:15 -- common/autotest_common.sh@10 -- # set +x 00:05:47.056 [2024-07-20 16:14:15.842164] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
00:05:47.056 [2024-07-20 16:14:15.842256] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2248362 ] 00:05:47.314 EAL: No free 2048 kB hugepages reported on node 1 00:05:47.314 [2024-07-20 16:14:15.911434] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:47.314 [2024-07-20 16:14:15.949418] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:47.314 [2024-07-20 16:14:15.949583] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:47.314 [2024-07-20 16:14:15.949586] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:47.879 16:14:16 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:47.879 16:14:16 -- common/autotest_common.sh@852 -- # return 0 00:05:47.879 16:14:16 -- spdkcli/tcp.sh@31 -- # socat_pid=2248492 00:05:47.879 16:14:16 -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:05:47.879 16:14:16 -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:05:48.137 [ 00:05:48.137 "spdk_get_version", 00:05:48.137 "rpc_get_methods", 00:05:48.137 "trace_get_info", 00:05:48.137 "trace_get_tpoint_group_mask", 00:05:48.137 "trace_disable_tpoint_group", 00:05:48.137 "trace_enable_tpoint_group", 00:05:48.137 "trace_clear_tpoint_mask", 00:05:48.137 "trace_set_tpoint_mask", 00:05:48.137 "vfu_tgt_set_base_path", 00:05:48.137 "framework_get_pci_devices", 00:05:48.137 "framework_get_config", 00:05:48.137 "framework_get_subsystems", 00:05:48.137 "iobuf_get_stats", 00:05:48.137 "iobuf_set_options", 00:05:48.137 "sock_set_default_impl", 00:05:48.137 "sock_impl_set_options", 00:05:48.137 "sock_impl_get_options", 00:05:48.137 "vmd_rescan", 00:05:48.137 "vmd_remove_device", 00:05:48.137 "vmd_enable", 00:05:48.137 "accel_get_stats", 00:05:48.137 "accel_set_options", 00:05:48.137 "accel_set_driver", 00:05:48.137 "accel_crypto_key_destroy", 00:05:48.137 "accel_crypto_keys_get", 00:05:48.137 "accel_crypto_key_create", 00:05:48.137 "accel_assign_opc", 00:05:48.137 "accel_get_module_info", 00:05:48.137 "accel_get_opc_assignments", 00:05:48.137 "notify_get_notifications", 00:05:48.137 "notify_get_types", 00:05:48.137 "bdev_get_histogram", 00:05:48.137 "bdev_enable_histogram", 00:05:48.137 "bdev_set_qos_limit", 00:05:48.137 "bdev_set_qd_sampling_period", 00:05:48.137 "bdev_get_bdevs", 00:05:48.137 "bdev_reset_iostat", 00:05:48.137 "bdev_get_iostat", 00:05:48.137 "bdev_examine", 00:05:48.137 "bdev_wait_for_examine", 00:05:48.137 "bdev_set_options", 00:05:48.137 "scsi_get_devices", 00:05:48.137 "thread_set_cpumask", 00:05:48.137 "framework_get_scheduler", 00:05:48.137 "framework_set_scheduler", 00:05:48.137 "framework_get_reactors", 00:05:48.137 "thread_get_io_channels", 00:05:48.137 "thread_get_pollers", 00:05:48.137 "thread_get_stats", 00:05:48.137 "framework_monitor_context_switch", 00:05:48.137 "spdk_kill_instance", 00:05:48.137 "log_enable_timestamps", 00:05:48.137 "log_get_flags", 00:05:48.137 "log_clear_flag", 00:05:48.137 "log_set_flag", 00:05:48.137 "log_get_level", 00:05:48.137 "log_set_level", 00:05:48.137 "log_get_print_level", 00:05:48.137 "log_set_print_level", 00:05:48.137 "framework_enable_cpumask_locks", 00:05:48.137 "framework_disable_cpumask_locks", 00:05:48.137 "framework_wait_init", 00:05:48.137 
"framework_start_init", 00:05:48.137 "virtio_blk_create_transport", 00:05:48.137 "virtio_blk_get_transports", 00:05:48.137 "vhost_controller_set_coalescing", 00:05:48.137 "vhost_get_controllers", 00:05:48.137 "vhost_delete_controller", 00:05:48.137 "vhost_create_blk_controller", 00:05:48.137 "vhost_scsi_controller_remove_target", 00:05:48.137 "vhost_scsi_controller_add_target", 00:05:48.137 "vhost_start_scsi_controller", 00:05:48.137 "vhost_create_scsi_controller", 00:05:48.137 "ublk_recover_disk", 00:05:48.137 "ublk_get_disks", 00:05:48.137 "ublk_stop_disk", 00:05:48.137 "ublk_start_disk", 00:05:48.137 "ublk_destroy_target", 00:05:48.137 "ublk_create_target", 00:05:48.137 "nbd_get_disks", 00:05:48.137 "nbd_stop_disk", 00:05:48.137 "nbd_start_disk", 00:05:48.137 "env_dpdk_get_mem_stats", 00:05:48.137 "nvmf_subsystem_get_listeners", 00:05:48.137 "nvmf_subsystem_get_qpairs", 00:05:48.137 "nvmf_subsystem_get_controllers", 00:05:48.137 "nvmf_get_stats", 00:05:48.137 "nvmf_get_transports", 00:05:48.137 "nvmf_create_transport", 00:05:48.137 "nvmf_get_targets", 00:05:48.137 "nvmf_delete_target", 00:05:48.137 "nvmf_create_target", 00:05:48.137 "nvmf_subsystem_allow_any_host", 00:05:48.137 "nvmf_subsystem_remove_host", 00:05:48.137 "nvmf_subsystem_add_host", 00:05:48.137 "nvmf_subsystem_remove_ns", 00:05:48.137 "nvmf_subsystem_add_ns", 00:05:48.137 "nvmf_subsystem_listener_set_ana_state", 00:05:48.137 "nvmf_discovery_get_referrals", 00:05:48.137 "nvmf_discovery_remove_referral", 00:05:48.137 "nvmf_discovery_add_referral", 00:05:48.137 "nvmf_subsystem_remove_listener", 00:05:48.137 "nvmf_subsystem_add_listener", 00:05:48.137 "nvmf_delete_subsystem", 00:05:48.137 "nvmf_create_subsystem", 00:05:48.137 "nvmf_get_subsystems", 00:05:48.137 "nvmf_set_crdt", 00:05:48.137 "nvmf_set_config", 00:05:48.137 "nvmf_set_max_subsystems", 00:05:48.137 "iscsi_set_options", 00:05:48.137 "iscsi_get_auth_groups", 00:05:48.137 "iscsi_auth_group_remove_secret", 00:05:48.137 "iscsi_auth_group_add_secret", 00:05:48.137 "iscsi_delete_auth_group", 00:05:48.137 "iscsi_create_auth_group", 00:05:48.137 "iscsi_set_discovery_auth", 00:05:48.137 "iscsi_get_options", 00:05:48.137 "iscsi_target_node_request_logout", 00:05:48.137 "iscsi_target_node_set_redirect", 00:05:48.137 "iscsi_target_node_set_auth", 00:05:48.137 "iscsi_target_node_add_lun", 00:05:48.137 "iscsi_get_connections", 00:05:48.137 "iscsi_portal_group_set_auth", 00:05:48.137 "iscsi_start_portal_group", 00:05:48.137 "iscsi_delete_portal_group", 00:05:48.137 "iscsi_create_portal_group", 00:05:48.137 "iscsi_get_portal_groups", 00:05:48.137 "iscsi_delete_target_node", 00:05:48.137 "iscsi_target_node_remove_pg_ig_maps", 00:05:48.137 "iscsi_target_node_add_pg_ig_maps", 00:05:48.137 "iscsi_create_target_node", 00:05:48.137 "iscsi_get_target_nodes", 00:05:48.137 "iscsi_delete_initiator_group", 00:05:48.137 "iscsi_initiator_group_remove_initiators", 00:05:48.137 "iscsi_initiator_group_add_initiators", 00:05:48.137 "iscsi_create_initiator_group", 00:05:48.137 "iscsi_get_initiator_groups", 00:05:48.137 "vfu_virtio_create_scsi_endpoint", 00:05:48.137 "vfu_virtio_scsi_remove_target", 00:05:48.137 "vfu_virtio_scsi_add_target", 00:05:48.137 "vfu_virtio_create_blk_endpoint", 00:05:48.137 "vfu_virtio_delete_endpoint", 00:05:48.137 "iaa_scan_accel_module", 00:05:48.137 "dsa_scan_accel_module", 00:05:48.137 "ioat_scan_accel_module", 00:05:48.137 "accel_error_inject_error", 00:05:48.137 "bdev_iscsi_delete", 00:05:48.137 "bdev_iscsi_create", 00:05:48.137 "bdev_iscsi_set_options", 
00:05:48.137 "bdev_virtio_attach_controller", 00:05:48.137 "bdev_virtio_scsi_get_devices", 00:05:48.137 "bdev_virtio_detach_controller", 00:05:48.137 "bdev_virtio_blk_set_hotplug", 00:05:48.137 "bdev_ftl_set_property", 00:05:48.137 "bdev_ftl_get_properties", 00:05:48.137 "bdev_ftl_get_stats", 00:05:48.137 "bdev_ftl_unmap", 00:05:48.137 "bdev_ftl_unload", 00:05:48.137 "bdev_ftl_delete", 00:05:48.137 "bdev_ftl_load", 00:05:48.137 "bdev_ftl_create", 00:05:48.137 "bdev_aio_delete", 00:05:48.137 "bdev_aio_rescan", 00:05:48.137 "bdev_aio_create", 00:05:48.137 "blobfs_create", 00:05:48.137 "blobfs_detect", 00:05:48.137 "blobfs_set_cache_size", 00:05:48.137 "bdev_zone_block_delete", 00:05:48.137 "bdev_zone_block_create", 00:05:48.137 "bdev_delay_delete", 00:05:48.137 "bdev_delay_create", 00:05:48.137 "bdev_delay_update_latency", 00:05:48.137 "bdev_split_delete", 00:05:48.137 "bdev_split_create", 00:05:48.137 "bdev_error_inject_error", 00:05:48.137 "bdev_error_delete", 00:05:48.137 "bdev_error_create", 00:05:48.137 "bdev_raid_set_options", 00:05:48.137 "bdev_raid_remove_base_bdev", 00:05:48.137 "bdev_raid_add_base_bdev", 00:05:48.137 "bdev_raid_delete", 00:05:48.137 "bdev_raid_create", 00:05:48.137 "bdev_raid_get_bdevs", 00:05:48.137 "bdev_lvol_grow_lvstore", 00:05:48.137 "bdev_lvol_get_lvols", 00:05:48.137 "bdev_lvol_get_lvstores", 00:05:48.137 "bdev_lvol_delete", 00:05:48.137 "bdev_lvol_set_read_only", 00:05:48.137 "bdev_lvol_resize", 00:05:48.137 "bdev_lvol_decouple_parent", 00:05:48.137 "bdev_lvol_inflate", 00:05:48.137 "bdev_lvol_rename", 00:05:48.137 "bdev_lvol_clone_bdev", 00:05:48.137 "bdev_lvol_clone", 00:05:48.137 "bdev_lvol_snapshot", 00:05:48.137 "bdev_lvol_create", 00:05:48.137 "bdev_lvol_delete_lvstore", 00:05:48.137 "bdev_lvol_rename_lvstore", 00:05:48.137 "bdev_lvol_create_lvstore", 00:05:48.137 "bdev_passthru_delete", 00:05:48.137 "bdev_passthru_create", 00:05:48.137 "bdev_nvme_cuse_unregister", 00:05:48.137 "bdev_nvme_cuse_register", 00:05:48.137 "bdev_opal_new_user", 00:05:48.137 "bdev_opal_set_lock_state", 00:05:48.137 "bdev_opal_delete", 00:05:48.137 "bdev_opal_get_info", 00:05:48.137 "bdev_opal_create", 00:05:48.137 "bdev_nvme_opal_revert", 00:05:48.137 "bdev_nvme_opal_init", 00:05:48.137 "bdev_nvme_send_cmd", 00:05:48.137 "bdev_nvme_get_path_iostat", 00:05:48.137 "bdev_nvme_get_mdns_discovery_info", 00:05:48.137 "bdev_nvme_stop_mdns_discovery", 00:05:48.137 "bdev_nvme_start_mdns_discovery", 00:05:48.137 "bdev_nvme_set_multipath_policy", 00:05:48.137 "bdev_nvme_set_preferred_path", 00:05:48.137 "bdev_nvme_get_io_paths", 00:05:48.137 "bdev_nvme_remove_error_injection", 00:05:48.137 "bdev_nvme_add_error_injection", 00:05:48.137 "bdev_nvme_get_discovery_info", 00:05:48.137 "bdev_nvme_stop_discovery", 00:05:48.137 "bdev_nvme_start_discovery", 00:05:48.137 "bdev_nvme_get_controller_health_info", 00:05:48.137 "bdev_nvme_disable_controller", 00:05:48.137 "bdev_nvme_enable_controller", 00:05:48.137 "bdev_nvme_reset_controller", 00:05:48.137 "bdev_nvme_get_transport_statistics", 00:05:48.137 "bdev_nvme_apply_firmware", 00:05:48.137 "bdev_nvme_detach_controller", 00:05:48.137 "bdev_nvme_get_controllers", 00:05:48.137 "bdev_nvme_attach_controller", 00:05:48.137 "bdev_nvme_set_hotplug", 00:05:48.137 "bdev_nvme_set_options", 00:05:48.137 "bdev_null_resize", 00:05:48.137 "bdev_null_delete", 00:05:48.137 "bdev_null_create", 00:05:48.137 "bdev_malloc_delete", 00:05:48.137 "bdev_malloc_create" 00:05:48.137 ] 00:05:48.137 16:14:16 -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 
00:05:48.137 16:14:16 -- common/autotest_common.sh@718 -- # xtrace_disable 00:05:48.137 16:14:16 -- common/autotest_common.sh@10 -- # set +x 00:05:48.138 16:14:16 -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:05:48.138 16:14:16 -- spdkcli/tcp.sh@38 -- # killprocess 2248362 00:05:48.138 16:14:16 -- common/autotest_common.sh@926 -- # '[' -z 2248362 ']' 00:05:48.138 16:14:16 -- common/autotest_common.sh@930 -- # kill -0 2248362 00:05:48.138 16:14:16 -- common/autotest_common.sh@931 -- # uname 00:05:48.138 16:14:16 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:48.138 16:14:16 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2248362 00:05:48.138 16:14:16 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:48.138 16:14:16 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:48.138 16:14:16 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2248362' 00:05:48.138 killing process with pid 2248362 00:05:48.138 16:14:16 -- common/autotest_common.sh@945 -- # kill 2248362 00:05:48.138 16:14:16 -- common/autotest_common.sh@950 -- # wait 2248362 00:05:48.705 00:05:48.705 real 0m1.492s 00:05:48.705 user 0m2.779s 00:05:48.705 sys 0m0.480s 00:05:48.705 16:14:17 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:48.705 16:14:17 -- common/autotest_common.sh@10 -- # set +x 00:05:48.705 ************************************ 00:05:48.705 END TEST spdkcli_tcp 00:05:48.705 ************************************ 00:05:48.705 16:14:17 -- spdk/autotest.sh@186 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:48.705 16:14:17 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:48.705 16:14:17 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:48.705 16:14:17 -- common/autotest_common.sh@10 -- # set +x 00:05:48.705 ************************************ 00:05:48.705 START TEST dpdk_mem_utility 00:05:48.705 ************************************ 00:05:48.705 16:14:17 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:48.705 * Looking for test storage... 00:05:48.705 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:05:48.705 16:14:17 -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:48.705 16:14:17 -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=2248694 00:05:48.705 16:14:17 -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:48.705 16:14:17 -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 2248694 00:05:48.705 16:14:17 -- common/autotest_common.sh@819 -- # '[' -z 2248694 ']' 00:05:48.705 16:14:17 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:48.705 16:14:17 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:48.705 16:14:17 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:48.705 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:05:48.705 16:14:17 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:48.705 16:14:17 -- common/autotest_common.sh@10 -- # set +x 00:05:48.705 [2024-07-20 16:14:17.369287] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:05:48.705 [2024-07-20 16:14:17.369381] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2248694 ] 00:05:48.705 EAL: No free 2048 kB hugepages reported on node 1 00:05:48.705 [2024-07-20 16:14:17.439911] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:48.705 [2024-07-20 16:14:17.478101] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:48.705 [2024-07-20 16:14:17.478217] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:49.640 16:14:18 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:49.640 16:14:18 -- common/autotest_common.sh@852 -- # return 0 00:05:49.640 16:14:18 -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:05:49.640 16:14:18 -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:05:49.640 16:14:18 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:49.640 16:14:18 -- common/autotest_common.sh@10 -- # set +x 00:05:49.640 { 00:05:49.640 "filename": "/tmp/spdk_mem_dump.txt" 00:05:49.640 } 00:05:49.640 16:14:18 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:49.640 16:14:18 -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:49.640 DPDK memory size 814.000000 MiB in 1 heap(s) 00:05:49.640 1 heaps totaling size 814.000000 MiB 00:05:49.640 size: 814.000000 MiB heap id: 0 00:05:49.640 end heaps---------- 00:05:49.640 8 mempools totaling size 598.116089 MiB 00:05:49.640 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:05:49.640 size: 158.602051 MiB name: PDU_data_out_Pool 00:05:49.640 size: 84.521057 MiB name: bdev_io_2248694 00:05:49.640 size: 51.011292 MiB name: evtpool_2248694 00:05:49.640 size: 50.003479 MiB name: msgpool_2248694 00:05:49.640 size: 21.763794 MiB name: PDU_Pool 00:05:49.640 size: 19.513306 MiB name: SCSI_TASK_Pool 00:05:49.640 size: 0.026123 MiB name: Session_Pool 00:05:49.640 end mempools------- 00:05:49.640 6 memzones totaling size 4.142822 MiB 00:05:49.640 size: 1.000366 MiB name: RG_ring_0_2248694 00:05:49.640 size: 1.000366 MiB name: RG_ring_1_2248694 00:05:49.640 size: 1.000366 MiB name: RG_ring_4_2248694 00:05:49.640 size: 1.000366 MiB name: RG_ring_5_2248694 00:05:49.640 size: 0.125366 MiB name: RG_ring_2_2248694 00:05:49.640 size: 0.015991 MiB name: RG_ring_3_2248694 00:05:49.640 end memzones------- 00:05:49.640 16:14:18 -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:05:49.640 heap id: 0 total size: 814.000000 MiB number of busy elements: 41 number of free elements: 15 00:05:49.640 list of free elements. 
size: 12.519348 MiB 00:05:49.640 element at address: 0x200000400000 with size: 1.999512 MiB 00:05:49.640 element at address: 0x200018e00000 with size: 0.999878 MiB 00:05:49.640 element at address: 0x200019000000 with size: 0.999878 MiB 00:05:49.640 element at address: 0x200003e00000 with size: 0.996277 MiB 00:05:49.640 element at address: 0x200031c00000 with size: 0.994446 MiB 00:05:49.640 element at address: 0x200013800000 with size: 0.978699 MiB 00:05:49.640 element at address: 0x200007000000 with size: 0.959839 MiB 00:05:49.640 element at address: 0x200019200000 with size: 0.936584 MiB 00:05:49.640 element at address: 0x200000200000 with size: 0.841614 MiB 00:05:49.640 element at address: 0x20001aa00000 with size: 0.582886 MiB 00:05:49.640 element at address: 0x20000b200000 with size: 0.490723 MiB 00:05:49.640 element at address: 0x200000800000 with size: 0.487793 MiB 00:05:49.640 element at address: 0x200019400000 with size: 0.485657 MiB 00:05:49.640 element at address: 0x200027e00000 with size: 0.410034 MiB 00:05:49.640 element at address: 0x200003a00000 with size: 0.355530 MiB 00:05:49.640 list of standard malloc elements. size: 199.218079 MiB 00:05:49.640 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:05:49.640 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:05:49.640 element at address: 0x200018efff80 with size: 1.000122 MiB 00:05:49.640 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:05:49.640 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:05:49.640 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:05:49.640 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:05:49.640 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:05:49.640 element at address: 0x2000192efdc0 with size: 0.000305 MiB 00:05:49.640 element at address: 0x2000002d7740 with size: 0.000183 MiB 00:05:49.640 element at address: 0x2000002d7800 with size: 0.000183 MiB 00:05:49.640 element at address: 0x2000002d78c0 with size: 0.000183 MiB 00:05:49.640 element at address: 0x2000002d7ac0 with size: 0.000183 MiB 00:05:49.640 element at address: 0x2000002d7b80 with size: 0.000183 MiB 00:05:49.640 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:05:49.640 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:05:49.640 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:05:49.640 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:05:49.640 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:05:49.640 element at address: 0x200003a5b040 with size: 0.000183 MiB 00:05:49.640 element at address: 0x200003adb300 with size: 0.000183 MiB 00:05:49.640 element at address: 0x200003adb500 with size: 0.000183 MiB 00:05:49.641 element at address: 0x200003adf7c0 with size: 0.000183 MiB 00:05:49.641 element at address: 0x200003affa80 with size: 0.000183 MiB 00:05:49.641 element at address: 0x200003affb40 with size: 0.000183 MiB 00:05:49.641 element at address: 0x200003eff0c0 with size: 0.000183 MiB 00:05:49.641 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:05:49.641 element at address: 0x20000b27da00 with size: 0.000183 MiB 00:05:49.641 element at address: 0x20000b27dac0 with size: 0.000183 MiB 00:05:49.641 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:05:49.641 element at address: 0x2000138fa8c0 with size: 0.000183 MiB 00:05:49.641 element at address: 0x2000192efc40 with size: 0.000183 MiB 00:05:49.641 element at address: 0x2000192efd00 with size: 0.000183 MiB 
00:05:49.641 element at address: 0x2000194bc740 with size: 0.000183 MiB 00:05:49.641 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:05:49.641 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:05:49.641 element at address: 0x200027e68f80 with size: 0.000183 MiB 00:05:49.641 element at address: 0x200027e69040 with size: 0.000183 MiB 00:05:49.641 element at address: 0x200027e6fc40 with size: 0.000183 MiB 00:05:49.641 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:05:49.641 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:05:49.641 list of memzone associated elements. size: 602.262573 MiB 00:05:49.641 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:05:49.641 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:05:49.641 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:05:49.641 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:05:49.641 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:05:49.641 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_2248694_0 00:05:49.641 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:05:49.641 associated memzone info: size: 48.002930 MiB name: MP_evtpool_2248694_0 00:05:49.641 element at address: 0x200003fff380 with size: 48.003052 MiB 00:05:49.641 associated memzone info: size: 48.002930 MiB name: MP_msgpool_2248694_0 00:05:49.641 element at address: 0x2000195be940 with size: 20.255554 MiB 00:05:49.641 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:05:49.641 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:05:49.641 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:05:49.641 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:05:49.641 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_2248694 00:05:49.641 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:05:49.641 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_2248694 00:05:49.641 element at address: 0x2000002d7d00 with size: 1.008118 MiB 00:05:49.641 associated memzone info: size: 1.007996 MiB name: MP_evtpool_2248694 00:05:49.641 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:05:49.641 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:05:49.641 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:05:49.641 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:05:49.641 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:05:49.641 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:05:49.641 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:05:49.641 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:05:49.641 element at address: 0x200003eff180 with size: 1.000488 MiB 00:05:49.641 associated memzone info: size: 1.000366 MiB name: RG_ring_0_2248694 00:05:49.641 element at address: 0x200003affc00 with size: 1.000488 MiB 00:05:49.641 associated memzone info: size: 1.000366 MiB name: RG_ring_1_2248694 00:05:49.641 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:05:49.641 associated memzone info: size: 1.000366 MiB name: RG_ring_4_2248694 00:05:49.641 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:05:49.641 associated memzone info: size: 1.000366 MiB name: RG_ring_5_2248694 00:05:49.641 element at address: 0x200003a5b100 with size: 0.500488 MiB 00:05:49.641 associated 
memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_2248694 00:05:49.641 element at address: 0x20000b27db80 with size: 0.500488 MiB 00:05:49.641 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:05:49.641 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:05:49.641 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:05:49.641 element at address: 0x20001947c540 with size: 0.250488 MiB 00:05:49.641 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:05:49.641 element at address: 0x200003adf880 with size: 0.125488 MiB 00:05:49.641 associated memzone info: size: 0.125366 MiB name: RG_ring_2_2248694 00:05:49.641 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:05:49.641 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:05:49.641 element at address: 0x200027e69100 with size: 0.023743 MiB 00:05:49.641 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:05:49.641 element at address: 0x200003adb5c0 with size: 0.016113 MiB 00:05:49.641 associated memzone info: size: 0.015991 MiB name: RG_ring_3_2248694 00:05:49.641 element at address: 0x200027e6f240 with size: 0.002441 MiB 00:05:49.641 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:05:49.641 element at address: 0x2000002d7980 with size: 0.000305 MiB 00:05:49.641 associated memzone info: size: 0.000183 MiB name: MP_msgpool_2248694 00:05:49.641 element at address: 0x200003adb3c0 with size: 0.000305 MiB 00:05:49.641 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_2248694 00:05:49.641 element at address: 0x200027e6fd00 with size: 0.000305 MiB 00:05:49.641 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:05:49.641 16:14:18 -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:05:49.641 16:14:18 -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 2248694 00:05:49.641 16:14:18 -- common/autotest_common.sh@926 -- # '[' -z 2248694 ']' 00:05:49.641 16:14:18 -- common/autotest_common.sh@930 -- # kill -0 2248694 00:05:49.641 16:14:18 -- common/autotest_common.sh@931 -- # uname 00:05:49.641 16:14:18 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:49.641 16:14:18 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2248694 00:05:49.641 16:14:18 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:49.641 16:14:18 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:49.641 16:14:18 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2248694' 00:05:49.641 killing process with pid 2248694 00:05:49.641 16:14:18 -- common/autotest_common.sh@945 -- # kill 2248694 00:05:49.641 16:14:18 -- common/autotest_common.sh@950 -- # wait 2248694 00:05:49.899 00:05:49.899 real 0m1.379s 00:05:49.899 user 0m1.406s 00:05:49.899 sys 0m0.440s 00:05:49.900 16:14:18 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:49.900 16:14:18 -- common/autotest_common.sh@10 -- # set +x 00:05:49.900 ************************************ 00:05:49.900 END TEST dpdk_mem_utility 00:05:49.900 ************************************ 00:05:49.900 16:14:18 -- spdk/autotest.sh@187 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:05:49.900 16:14:18 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:49.900 16:14:18 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:49.900 16:14:18 -- common/autotest_common.sh@10 -- # set +x 
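The heap/mempool/memzone dumps above come from a two-step flow: an RPC asks the running target to write its DPDK memory stats to a file (/tmp/spdk_mem_dump.txt, per the trace), and dpdk_mem_info.py post-processes that file. Mirrored from the trace, with repo-relative paths assumed:

# Dump DPDK memory stats from the target, summarize them, then show
# per-heap detail for heap id 0.
./scripts/rpc.py env_dpdk_get_mem_stats
./scripts/dpdk_mem_info.py
./scripts/dpdk_mem_info.py -m 0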
00:05:49.900 ************************************ 00:05:49.900 START TEST event 00:05:49.900 ************************************ 00:05:49.900 16:14:18 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:05:50.159 * Looking for test storage... 00:05:50.159 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:05:50.159 16:14:18 -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:05:50.159 16:14:18 -- bdev/nbd_common.sh@6 -- # set -e 00:05:50.159 16:14:18 -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:50.159 16:14:18 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:05:50.159 16:14:18 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:50.159 16:14:18 -- common/autotest_common.sh@10 -- # set +x 00:05:50.159 ************************************ 00:05:50.159 START TEST event_perf 00:05:50.159 ************************************ 00:05:50.159 16:14:18 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:50.159 Running I/O for 1 seconds...[2024-07-20 16:14:18.801523] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:05:50.159 [2024-07-20 16:14:18.801615] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2248970 ] 00:05:50.159 EAL: No free 2048 kB hugepages reported on node 1 00:05:50.159 [2024-07-20 16:14:18.875289] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:50.159 [2024-07-20 16:14:18.913988] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:50.159 [2024-07-20 16:14:18.914081] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:50.159 [2024-07-20 16:14:18.914367] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:50.159 [2024-07-20 16:14:18.914370] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:51.537 Running I/O for 1 seconds... 00:05:51.537 lcore 0: 199970 00:05:51.537 lcore 1: 199969 00:05:51.537 lcore 2: 199971 00:05:51.537 lcore 3: 199969 00:05:51.537 done. 
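The four lcore counters above are event_perf's payload: each reactor in the 0xF mask dispatched roughly 200k events during the 1-second run. The benchmark binary is standalone and takes the same flags shown in the trace:

# Run the event-dispatch microbenchmark on 4 cores for 1 second.
./test/event/event_perf/event_perf -m 0xF -t 1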
00:05:51.537 00:05:51.537 real 0m1.187s 00:05:51.537 user 0m4.092s 00:05:51.537 sys 0m0.092s 00:05:51.537 16:14:19 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:51.537 16:14:19 -- common/autotest_common.sh@10 -- # set +x 00:05:51.537 ************************************ 00:05:51.537 END TEST event_perf 00:05:51.537 ************************************ 00:05:51.537 16:14:20 -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:05:51.537 16:14:20 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:05:51.537 16:14:20 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:51.537 16:14:20 -- common/autotest_common.sh@10 -- # set +x 00:05:51.537 ************************************ 00:05:51.537 START TEST event_reactor 00:05:51.537 ************************************ 00:05:51.537 16:14:20 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:05:51.537 [2024-07-20 16:14:20.037555] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:05:51.537 [2024-07-20 16:14:20.037651] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2249182 ] 00:05:51.537 EAL: No free 2048 kB hugepages reported on node 1 00:05:51.537 [2024-07-20 16:14:20.108022] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:51.537 [2024-07-20 16:14:20.144792] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:52.474 test_start 00:05:52.474 oneshot 00:05:52.474 tick 100 00:05:52.474 tick 100 00:05:52.474 tick 250 00:05:52.474 tick 100 00:05:52.474 tick 100 00:05:52.474 tick 250 00:05:52.474 tick 500 00:05:52.474 tick 100 00:05:52.474 tick 100 00:05:52.474 tick 100 00:05:52.474 tick 250 00:05:52.474 tick 100 00:05:52.474 tick 100 00:05:52.474 test_end 00:05:52.474 00:05:52.474 real 0m1.176s 00:05:52.474 user 0m1.080s 00:05:52.474 sys 0m0.091s 00:05:52.474 16:14:21 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:52.474 16:14:21 -- common/autotest_common.sh@10 -- # set +x 00:05:52.474 ************************************ 00:05:52.474 END TEST event_reactor 00:05:52.474 ************************************ 00:05:52.474 16:14:21 -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:05:52.474 16:14:21 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:05:52.474 16:14:21 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:52.474 16:14:21 -- common/autotest_common.sh@10 -- # set +x 00:05:52.474 ************************************ 00:05:52.474 START TEST event_reactor_perf 00:05:52.474 ************************************ 00:05:52.474 16:14:21 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:05:52.474 [2024-07-20 16:14:21.266201] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
00:05:52.474 [2024-07-20 16:14:21.266293] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2249466 ] 00:05:52.733 EAL: No free 2048 kB hugepages reported on node 1 00:05:52.733 [2024-07-20 16:14:21.336380] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:52.733 [2024-07-20 16:14:21.371437] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:53.669 test_start 00:05:53.669 test_end 00:05:53.669 Performance: 953762 events per second 00:05:53.669 00:05:53.669 real 0m1.177s 00:05:53.669 user 0m1.085s 00:05:53.669 sys 0m0.088s 00:05:53.669 16:14:22 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:53.669 16:14:22 -- common/autotest_common.sh@10 -- # set +x 00:05:53.669 ************************************ 00:05:53.669 END TEST event_reactor_perf 00:05:53.669 ************************************ 00:05:53.669 16:14:22 -- event/event.sh@49 -- # uname -s 00:05:53.928 16:14:22 -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:05:53.928 16:14:22 -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:05:53.928 16:14:22 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:53.928 16:14:22 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:53.928 16:14:22 -- common/autotest_common.sh@10 -- # set +x 00:05:53.928 ************************************ 00:05:53.928 START TEST event_scheduler 00:05:53.928 ************************************ 00:05:53.928 16:14:22 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:05:53.928 * Looking for test storage... 00:05:53.928 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:05:53.928 16:14:22 -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:05:53.928 16:14:22 -- scheduler/scheduler.sh@35 -- # scheduler_pid=2249773 00:05:53.928 16:14:22 -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:05:53.928 16:14:22 -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:05:53.928 16:14:22 -- scheduler/scheduler.sh@37 -- # waitforlisten 2249773 00:05:53.928 16:14:22 -- common/autotest_common.sh@819 -- # '[' -z 2249773 ']' 00:05:53.928 16:14:22 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:53.928 16:14:22 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:53.928 16:14:22 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:53.928 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:53.928 16:14:22 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:53.928 16:14:22 -- common/autotest_common.sh@10 -- # set +x 00:05:53.928 [2024-07-20 16:14:22.596689] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
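event_scheduler starts the scheduler app in the background with --wait-for-rpc, which holds initialization until RPCs arrive, and -p 0x2 to make lcore 2 the main core (hence --main-lcore=2 in the EAL parameters below); the trap guarantees cleanup if the test aborts. Condensed, with killprocess and waitforlisten being the harness helpers visible in the log:

SPDK_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk   # repo path used throughout this log
"$SPDK_ROOT/test/event/scheduler/scheduler" -m 0xF -p 0x2 --wait-for-rpc -f &
scheduler_pid=$!
trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT
waitforlisten "$scheduler_pid"   # polls until /var/tmp/spdk.sock accepts RPC connections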
00:05:53.928 [2024-07-20 16:14:22.596768] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2249773 ] 00:05:53.928 EAL: No free 2048 kB hugepages reported on node 1 00:05:53.928 [2024-07-20 16:14:22.661291] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:53.928 [2024-07-20 16:14:22.700407] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:53.928 [2024-07-20 16:14:22.700492] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:53.928 [2024-07-20 16:14:22.700514] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:53.928 [2024-07-20 16:14:22.700517] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:54.187 16:14:22 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:54.187 16:14:22 -- common/autotest_common.sh@852 -- # return 0 00:05:54.187 16:14:22 -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:05:54.187 16:14:22 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:54.187 16:14:22 -- common/autotest_common.sh@10 -- # set +x 00:05:54.187 POWER: Env isn't set yet! 00:05:54.187 POWER: Attempting to initialise ACPI cpufreq power management... 00:05:54.187 POWER: Failed to write /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor 00:05:54.187 POWER: Cannot set governor of lcore 0 to userspace 00:05:54.187 POWER: Attempting to initialise PSTAT power management... 00:05:54.187 POWER: Power management governor of lcore 0 has been set to 'performance' successfully 00:05:54.187 POWER: Initialized successfully for lcore 0 power management 00:05:54.187 POWER: Power management governor of lcore 1 has been set to 'performance' successfully 00:05:54.187 POWER: Initialized successfully for lcore 1 power management 00:05:54.187 POWER: Power management governor of lcore 2 has been set to 'performance' successfully 00:05:54.187 POWER: Initialized successfully for lcore 2 power management 00:05:54.187 POWER: Power management governor of lcore 3 has been set to 'performance' successfully 00:05:54.187 POWER: Initialized successfully for lcore 3 power management 00:05:54.187 [2024-07-20 16:14:22.804628] scheduler_dynamic.c: 387:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:05:54.187 [2024-07-20 16:14:22.804644] scheduler_dynamic.c: 389:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:05:54.187 [2024-07-20 16:14:22.804655] scheduler_dynamic.c: 391:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:05:54.187 16:14:22 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:54.187 16:14:22 -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:05:54.187 16:14:22 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:54.187 16:14:22 -- common/autotest_common.sh@10 -- # set +x 00:05:54.187 [2024-07-20 16:14:22.866801] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
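With the app held at --wait-for-rpc, the test switches it to the dynamic scheduler and only then completes startup; the POWER lines are DPDK power management moving each lcore's cpufreq governor to 'performance', and the NOTICE lines echo the dynamic scheduler's defaults (load limit 20, core limit 80, core busy 95). The equivalent direct calls, using the default RPC socket:

rpc="$SPDK_ROOT/scripts/rpc.py -s /var/tmp/spdk.sock"
$rpc framework_set_scheduler dynamic   # issued here before init, while the app is still paused
$rpc framework_start_init              # releases the app; reactors start as logged
cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor   # reads 'performance' for the test's duration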
00:05:54.187 16:14:22 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:54.187 16:14:22 -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:05:54.187 16:14:22 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:54.187 16:14:22 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:54.187 16:14:22 -- common/autotest_common.sh@10 -- # set +x 00:05:54.187 ************************************ 00:05:54.187 START TEST scheduler_create_thread 00:05:54.187 ************************************ 00:05:54.187 16:14:22 -- common/autotest_common.sh@1104 -- # scheduler_create_thread 00:05:54.187 16:14:22 -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:05:54.187 16:14:22 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:54.187 16:14:22 -- common/autotest_common.sh@10 -- # set +x 00:05:54.187 2 00:05:54.187 16:14:22 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:54.187 16:14:22 -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:05:54.187 16:14:22 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:54.187 16:14:22 -- common/autotest_common.sh@10 -- # set +x 00:05:54.187 3 00:05:54.187 16:14:22 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:54.187 16:14:22 -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:05:54.187 16:14:22 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:54.187 16:14:22 -- common/autotest_common.sh@10 -- # set +x 00:05:54.187 4 00:05:54.187 16:14:22 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:54.187 16:14:22 -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:05:54.187 16:14:22 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:54.187 16:14:22 -- common/autotest_common.sh@10 -- # set +x 00:05:54.187 5 00:05:54.187 16:14:22 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:54.187 16:14:22 -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:05:54.187 16:14:22 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:54.187 16:14:22 -- common/autotest_common.sh@10 -- # set +x 00:05:54.187 6 00:05:54.187 16:14:22 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:54.187 16:14:22 -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:05:54.187 16:14:22 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:54.187 16:14:22 -- common/autotest_common.sh@10 -- # set +x 00:05:54.187 7 00:05:54.187 16:14:22 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:54.187 16:14:22 -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:05:54.187 16:14:22 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:54.187 16:14:22 -- common/autotest_common.sh@10 -- # set +x 00:05:54.187 8 00:05:54.187 16:14:22 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:54.187 16:14:22 -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:05:54.187 16:14:22 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:54.188 16:14:22 -- common/autotest_common.sh@10 -- # set +x 00:05:54.188 9 00:05:54.188 
16:14:22 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:54.188 16:14:22 -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:05:54.188 16:14:22 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:54.188 16:14:22 -- common/autotest_common.sh@10 -- # set +x 00:05:54.188 10 00:05:54.188 16:14:22 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:54.188 16:14:22 -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:05:54.188 16:14:22 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:54.188 16:14:22 -- common/autotest_common.sh@10 -- # set +x 00:05:54.188 16:14:22 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:54.188 16:14:22 -- scheduler/scheduler.sh@22 -- # thread_id=11 00:05:54.188 16:14:22 -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:05:54.188 16:14:22 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:54.188 16:14:22 -- common/autotest_common.sh@10 -- # set +x 00:05:55.123 16:14:23 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:55.123 16:14:23 -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:05:55.123 16:14:23 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:55.123 16:14:23 -- common/autotest_common.sh@10 -- # set +x 00:05:56.499 16:14:25 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:56.499 16:14:25 -- scheduler/scheduler.sh@25 -- # thread_id=12 00:05:56.499 16:14:25 -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:05:56.499 16:14:25 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:56.499 16:14:25 -- common/autotest_common.sh@10 -- # set +x 00:05:57.515 16:14:26 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:57.515 00:05:57.515 real 0m3.382s 00:05:57.515 user 0m0.021s 00:05:57.515 sys 0m0.009s 00:05:57.515 16:14:26 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:57.515 16:14:26 -- common/autotest_common.sh@10 -- # set +x 00:05:57.515 ************************************ 00:05:57.515 END TEST scheduler_create_thread 00:05:57.515 ************************************ 00:05:57.515 16:14:26 -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:05:57.515 16:14:26 -- scheduler/scheduler.sh@46 -- # killprocess 2249773 00:05:57.515 16:14:26 -- common/autotest_common.sh@926 -- # '[' -z 2249773 ']' 00:05:57.515 16:14:26 -- common/autotest_common.sh@930 -- # kill -0 2249773 00:05:57.515 16:14:26 -- common/autotest_common.sh@931 -- # uname 00:05:57.515 16:14:26 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:57.515 16:14:26 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2249773 00:05:57.795 16:14:26 -- common/autotest_common.sh@932 -- # process_name=reactor_2 00:05:57.795 16:14:26 -- common/autotest_common.sh@936 -- # '[' reactor_2 = sudo ']' 00:05:57.795 16:14:26 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2249773' 00:05:57.795 killing process with pid 2249773 00:05:57.795 16:14:26 -- common/autotest_common.sh@945 -- # kill 2249773 00:05:57.795 16:14:26 -- common/autotest_common.sh@950 -- # wait 2249773 00:05:58.054 [2024-07-20 16:14:26.638572] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
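scheduler_create_thread exercises the scheduler app's plugin RPCs: eight pinned threads (100% and 0% activity on each of the four cores), an unpinned thread at 30%, then one thread whose activity is changed and one that is deleted. The shape of those calls, condensed; capturing the returned thread id via command substitution is a simplification of what the test does:

rpc="$SPDK_ROOT/scripts/rpc.py -s /var/tmp/spdk.sock --plugin scheduler_plugin"
$rpc scheduler_thread_create -n active_pinned -m 0x1 -a 100   # pinned to lcore 0, fully busy
$rpc scheduler_thread_create -n idle_pinned   -m 0x1 -a 0     # pinned to lcore 0, idle
$rpc scheduler_thread_create -n one_third_active -a 30        # unpinned, 30% active
tid=$($rpc scheduler_thread_create -n half_active -a 0)       # RPC returns the new thread id
$rpc scheduler_thread_set_active "$tid" 50                    # raise it to 50% activity
tid=$($rpc scheduler_thread_create -n deleted -a 100)
$rpc scheduler_thread_delete "$tid"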
00:05:58.054 POWER: Power management governor of lcore 0 has been set to 'powersave' successfully 00:05:58.054 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original 00:05:58.054 POWER: Power management governor of lcore 1 has been set to 'powersave' successfully 00:05:58.054 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original 00:05:58.054 POWER: Power management governor of lcore 2 has been set to 'powersave' successfully 00:05:58.054 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original 00:05:58.054 POWER: Power management governor of lcore 3 has been set to 'powersave' successfully 00:05:58.054 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original 00:05:58.054 00:05:58.054 real 0m4.370s 00:05:58.054 user 0m7.761s 00:05:58.054 sys 0m0.358s 00:05:58.054 16:14:26 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:58.054 16:14:26 -- common/autotest_common.sh@10 -- # set +x 00:05:58.054 ************************************ 00:05:58.054 END TEST event_scheduler 00:05:58.054 ************************************ 00:05:58.313 16:14:26 -- event/event.sh@51 -- # modprobe -n nbd 00:05:58.313 16:14:26 -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:05:58.313 16:14:26 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:58.313 16:14:26 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:58.313 16:14:26 -- common/autotest_common.sh@10 -- # set +x 00:05:58.313 ************************************ 00:05:58.313 START TEST app_repeat 00:05:58.313 ************************************ 00:05:58.313 16:14:26 -- common/autotest_common.sh@1104 -- # app_repeat_test 00:05:58.313 16:14:26 -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:58.313 16:14:26 -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:58.313 16:14:26 -- event/event.sh@13 -- # local nbd_list 00:05:58.313 16:14:26 -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:58.313 16:14:26 -- event/event.sh@14 -- # local bdev_list 00:05:58.313 16:14:26 -- event/event.sh@15 -- # local repeat_times=4 00:05:58.313 16:14:26 -- event/event.sh@17 -- # modprobe nbd 00:05:58.313 16:14:26 -- event/event.sh@19 -- # repeat_pid=2250630 00:05:58.313 16:14:26 -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:05:58.313 16:14:26 -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:05:58.313 16:14:26 -- event/event.sh@21 -- # echo 'Process app_repeat pid: 2250630' 00:05:58.313 Process app_repeat pid: 2250630 00:05:58.313 16:14:26 -- event/event.sh@23 -- # for i in {0..2} 00:05:58.313 16:14:26 -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:05:58.313 spdk_app_start Round 0 00:05:58.313 16:14:26 -- event/event.sh@25 -- # waitforlisten 2250630 /var/tmp/spdk-nbd.sock 00:05:58.313 16:14:26 -- common/autotest_common.sh@819 -- # '[' -z 2250630 ']' 00:05:58.313 16:14:26 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:58.313 16:14:26 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:58.313 16:14:26 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
00:05:58.313 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:58.313 16:14:26 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:58.313 16:14:26 -- common/autotest_common.sh@10 -- # set +x 00:05:58.313 [2024-07-20 16:14:26.941013] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:05:58.313 [2024-07-20 16:14:26.941107] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2250630 ] 00:05:58.313 EAL: No free 2048 kB hugepages reported on node 1 00:05:58.313 [2024-07-20 16:14:27.013536] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:58.313 [2024-07-20 16:14:27.051033] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:58.313 [2024-07-20 16:14:27.051035] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:59.249 16:14:27 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:59.249 16:14:27 -- common/autotest_common.sh@852 -- # return 0 00:05:59.250 16:14:27 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:59.250 Malloc0 00:05:59.250 16:14:27 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:59.508 Malloc1 00:05:59.508 16:14:28 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:59.508 16:14:28 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:59.508 16:14:28 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:59.508 16:14:28 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:59.508 16:14:28 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:59.508 16:14:28 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:59.508 16:14:28 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:59.508 16:14:28 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:59.508 16:14:28 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:59.508 16:14:28 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:59.508 16:14:28 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:59.508 16:14:28 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:59.508 16:14:28 -- bdev/nbd_common.sh@12 -- # local i 00:05:59.508 16:14:28 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:59.508 16:14:28 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:59.508 16:14:28 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:59.508 /dev/nbd0 00:05:59.509 16:14:28 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:59.509 16:14:28 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:59.509 16:14:28 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:05:59.509 16:14:28 -- common/autotest_common.sh@857 -- # local i 00:05:59.509 16:14:28 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:59.509 16:14:28 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:59.509 16:14:28 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:05:59.509 16:14:28 -- 
common/autotest_common.sh@861 -- # break 00:05:59.509 16:14:28 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:59.509 16:14:28 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:59.509 16:14:28 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:59.509 1+0 records in 00:05:59.509 1+0 records out 00:05:59.509 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000136737 s, 30.0 MB/s 00:05:59.768 16:14:28 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:59.768 16:14:28 -- common/autotest_common.sh@874 -- # size=4096 00:05:59.768 16:14:28 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:59.768 16:14:28 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:59.768 16:14:28 -- common/autotest_common.sh@877 -- # return 0 00:05:59.768 16:14:28 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:59.768 16:14:28 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:59.768 16:14:28 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:59.768 /dev/nbd1 00:05:59.768 16:14:28 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:59.768 16:14:28 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:59.768 16:14:28 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:05:59.768 16:14:28 -- common/autotest_common.sh@857 -- # local i 00:05:59.768 16:14:28 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:59.768 16:14:28 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:59.768 16:14:28 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:05:59.768 16:14:28 -- common/autotest_common.sh@861 -- # break 00:05:59.768 16:14:28 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:59.768 16:14:28 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:59.768 16:14:28 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:59.768 1+0 records in 00:05:59.768 1+0 records out 00:05:59.768 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00027286 s, 15.0 MB/s 00:05:59.768 16:14:28 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:59.768 16:14:28 -- common/autotest_common.sh@874 -- # size=4096 00:05:59.768 16:14:28 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:59.768 16:14:28 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:59.768 16:14:28 -- common/autotest_common.sh@877 -- # return 0 00:05:59.768 16:14:28 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:59.768 16:14:28 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:59.768 16:14:28 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:59.768 16:14:28 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:59.768 16:14:28 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:00.027 16:14:28 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:00.027 { 00:06:00.027 "nbd_device": "/dev/nbd0", 00:06:00.027 "bdev_name": "Malloc0" 00:06:00.027 }, 00:06:00.027 { 00:06:00.027 "nbd_device": 
"/dev/nbd1", 00:06:00.027 "bdev_name": "Malloc1" 00:06:00.027 } 00:06:00.027 ]' 00:06:00.027 16:14:28 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:00.027 16:14:28 -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:00.027 { 00:06:00.027 "nbd_device": "/dev/nbd0", 00:06:00.027 "bdev_name": "Malloc0" 00:06:00.027 }, 00:06:00.027 { 00:06:00.027 "nbd_device": "/dev/nbd1", 00:06:00.027 "bdev_name": "Malloc1" 00:06:00.027 } 00:06:00.027 ]' 00:06:00.027 16:14:28 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:00.027 /dev/nbd1' 00:06:00.027 16:14:28 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:00.027 /dev/nbd1' 00:06:00.027 16:14:28 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:00.027 16:14:28 -- bdev/nbd_common.sh@65 -- # count=2 00:06:00.027 16:14:28 -- bdev/nbd_common.sh@66 -- # echo 2 00:06:00.027 16:14:28 -- bdev/nbd_common.sh@95 -- # count=2 00:06:00.027 16:14:28 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:00.027 16:14:28 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:00.027 16:14:28 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:00.027 16:14:28 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:00.027 16:14:28 -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:00.027 16:14:28 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:00.027 16:14:28 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:00.027 16:14:28 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:00.027 256+0 records in 00:06:00.027 256+0 records out 00:06:00.027 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0103463 s, 101 MB/s 00:06:00.027 16:14:28 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:00.027 16:14:28 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:00.027 256+0 records in 00:06:00.027 256+0 records out 00:06:00.027 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0199082 s, 52.7 MB/s 00:06:00.027 16:14:28 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:00.027 16:14:28 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:00.027 256+0 records in 00:06:00.027 256+0 records out 00:06:00.027 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0215069 s, 48.8 MB/s 00:06:00.027 16:14:28 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:00.027 16:14:28 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:00.027 16:14:28 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:00.027 16:14:28 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:00.027 16:14:28 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:00.027 16:14:28 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:00.027 16:14:28 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:00.027 16:14:28 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:00.027 16:14:28 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:00.027 16:14:28 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:00.027 16:14:28 -- 
bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:00.027 16:14:28 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:00.286 16:14:28 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:00.286 16:14:28 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:00.286 16:14:28 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:00.286 16:14:28 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:00.286 16:14:28 -- bdev/nbd_common.sh@51 -- # local i 00:06:00.286 16:14:28 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:00.286 16:14:28 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:00.286 16:14:29 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:00.286 16:14:29 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:00.286 16:14:29 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:00.286 16:14:29 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:00.286 16:14:29 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:00.286 16:14:29 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:00.286 16:14:29 -- bdev/nbd_common.sh@41 -- # break 00:06:00.286 16:14:29 -- bdev/nbd_common.sh@45 -- # return 0 00:06:00.286 16:14:29 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:00.286 16:14:29 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:00.545 16:14:29 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:00.545 16:14:29 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:00.545 16:14:29 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:00.545 16:14:29 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:00.545 16:14:29 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:00.545 16:14:29 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:00.545 16:14:29 -- bdev/nbd_common.sh@41 -- # break 00:06:00.545 16:14:29 -- bdev/nbd_common.sh@45 -- # return 0 00:06:00.545 16:14:29 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:00.545 16:14:29 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:00.545 16:14:29 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:00.804 16:14:29 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:00.804 16:14:29 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:00.804 16:14:29 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:00.804 16:14:29 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:00.804 16:14:29 -- bdev/nbd_common.sh@65 -- # echo '' 00:06:00.804 16:14:29 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:00.804 16:14:29 -- bdev/nbd_common.sh@65 -- # true 00:06:00.804 16:14:29 -- bdev/nbd_common.sh@65 -- # count=0 00:06:00.804 16:14:29 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:00.804 16:14:29 -- bdev/nbd_common.sh@104 -- # count=0 00:06:00.804 16:14:29 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:00.804 16:14:29 -- bdev/nbd_common.sh@109 -- # return 0 00:06:00.804 16:14:29 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 
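Underneath all the nbd plumbing, app_repeat's data check is ordinary dd/cmp: 1 MiB of random data is pushed through each nbd device with O_DIRECT and compared back against the source file. Stripped to its essentials, with paths as in the log:

tmp="$SPDK_ROOT/test/event/nbdrandtest"
dd if=/dev/urandom of="$tmp" bs=4096 count=256            # 1 MiB of random data
for nbd in /dev/nbd0 /dev/nbd1; do
  dd if="$tmp" of="$nbd" bs=4096 count=256 oflag=direct   # write through the malloc bdev
  cmp -b -n 1M "$tmp" "$nbd"                              # exits non-zero on any mismatch
done
rm "$tmp"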
00:06:01.064 16:14:29 -- event/event.sh@35 -- # sleep 3 00:06:01.064 [2024-07-20 16:14:29.802945] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:01.064 [2024-07-20 16:14:29.835521] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:01.064 [2024-07-20 16:14:29.835523] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:01.323 [2024-07-20 16:14:29.876546] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:01.323 [2024-07-20 16:14:29.876591] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:03.859 16:14:32 -- event/event.sh@23 -- # for i in {0..2} 00:06:03.859 16:14:32 -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:06:03.859 spdk_app_start Round 1 00:06:03.859 16:14:32 -- event/event.sh@25 -- # waitforlisten 2250630 /var/tmp/spdk-nbd.sock 00:06:03.859 16:14:32 -- common/autotest_common.sh@819 -- # '[' -z 2250630 ']' 00:06:03.859 16:14:32 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:03.859 16:14:32 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:03.859 16:14:32 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:03.859 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:03.859 16:14:32 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:03.859 16:14:32 -- common/autotest_common.sh@10 -- # set +x 00:06:04.118 16:14:32 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:04.118 16:14:32 -- common/autotest_common.sh@852 -- # return 0 00:06:04.118 16:14:32 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:04.376 Malloc0 00:06:04.376 16:14:32 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:04.376 Malloc1 00:06:04.376 16:14:33 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:04.376 16:14:33 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:04.376 16:14:33 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:04.376 16:14:33 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:04.376 16:14:33 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:04.376 16:14:33 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:04.376 16:14:33 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:04.376 16:14:33 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:04.376 16:14:33 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:04.376 16:14:33 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:04.376 16:14:33 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:04.376 16:14:33 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:04.376 16:14:33 -- bdev/nbd_common.sh@12 -- # local i 00:06:04.376 16:14:33 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:04.376 16:14:33 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:04.376 16:14:33 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:04.634 
/dev/nbd0 00:06:04.634 16:14:33 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:04.634 16:14:33 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:04.634 16:14:33 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:06:04.634 16:14:33 -- common/autotest_common.sh@857 -- # local i 00:06:04.634 16:14:33 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:06:04.634 16:14:33 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:06:04.634 16:14:33 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:06:04.634 16:14:33 -- common/autotest_common.sh@861 -- # break 00:06:04.634 16:14:33 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:06:04.634 16:14:33 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:06:04.634 16:14:33 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:04.634 1+0 records in 00:06:04.634 1+0 records out 00:06:04.634 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000221796 s, 18.5 MB/s 00:06:04.634 16:14:33 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:04.634 16:14:33 -- common/autotest_common.sh@874 -- # size=4096 00:06:04.634 16:14:33 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:04.634 16:14:33 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:06:04.634 16:14:33 -- common/autotest_common.sh@877 -- # return 0 00:06:04.634 16:14:33 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:04.634 16:14:33 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:04.634 16:14:33 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:04.892 /dev/nbd1 00:06:04.892 16:14:33 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:04.892 16:14:33 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:04.892 16:14:33 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:06:04.892 16:14:33 -- common/autotest_common.sh@857 -- # local i 00:06:04.892 16:14:33 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:06:04.892 16:14:33 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:06:04.892 16:14:33 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:06:04.892 16:14:33 -- common/autotest_common.sh@861 -- # break 00:06:04.892 16:14:33 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:06:04.892 16:14:33 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:06:04.892 16:14:33 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:04.892 1+0 records in 00:06:04.892 1+0 records out 00:06:04.892 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000185158 s, 22.1 MB/s 00:06:04.892 16:14:33 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:04.892 16:14:33 -- common/autotest_common.sh@874 -- # size=4096 00:06:04.892 16:14:33 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:04.892 16:14:33 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:06:04.892 16:14:33 -- common/autotest_common.sh@877 -- # return 0 00:06:04.892 16:14:33 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:04.892 16:14:33 -- bdev/nbd_common.sh@14 -- # (( i < 2 
)) 00:06:04.892 16:14:33 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:04.892 16:14:33 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:04.892 16:14:33 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:05.150 16:14:33 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:05.150 { 00:06:05.150 "nbd_device": "/dev/nbd0", 00:06:05.150 "bdev_name": "Malloc0" 00:06:05.150 }, 00:06:05.150 { 00:06:05.150 "nbd_device": "/dev/nbd1", 00:06:05.150 "bdev_name": "Malloc1" 00:06:05.150 } 00:06:05.150 ]' 00:06:05.150 16:14:33 -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:05.150 { 00:06:05.150 "nbd_device": "/dev/nbd0", 00:06:05.150 "bdev_name": "Malloc0" 00:06:05.150 }, 00:06:05.150 { 00:06:05.150 "nbd_device": "/dev/nbd1", 00:06:05.150 "bdev_name": "Malloc1" 00:06:05.150 } 00:06:05.150 ]' 00:06:05.150 16:14:33 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:05.150 16:14:33 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:05.150 /dev/nbd1' 00:06:05.150 16:14:33 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:05.150 /dev/nbd1' 00:06:05.150 16:14:33 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:05.150 16:14:33 -- bdev/nbd_common.sh@65 -- # count=2 00:06:05.150 16:14:33 -- bdev/nbd_common.sh@66 -- # echo 2 00:06:05.150 16:14:33 -- bdev/nbd_common.sh@95 -- # count=2 00:06:05.150 16:14:33 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:05.150 16:14:33 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:05.150 16:14:33 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:05.150 16:14:33 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:05.150 16:14:33 -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:05.150 16:14:33 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:05.150 16:14:33 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:05.150 16:14:33 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:05.150 256+0 records in 00:06:05.150 256+0 records out 00:06:05.150 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0107783 s, 97.3 MB/s 00:06:05.150 16:14:33 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:05.150 16:14:33 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:05.150 256+0 records in 00:06:05.150 256+0 records out 00:06:05.150 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0199063 s, 52.7 MB/s 00:06:05.150 16:14:33 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:05.150 16:14:33 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:05.150 256+0 records in 00:06:05.150 256+0 records out 00:06:05.150 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0209363 s, 50.1 MB/s 00:06:05.150 16:14:33 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:05.150 16:14:33 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:05.150 16:14:33 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:05.150 16:14:33 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:05.150 16:14:33 -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:05.150 16:14:33 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:05.150 16:14:33 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:05.150 16:14:33 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:05.150 16:14:33 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:05.150 16:14:33 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:05.150 16:14:33 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:05.150 16:14:33 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:05.150 16:14:33 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:05.150 16:14:33 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:05.150 16:14:33 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:05.150 16:14:33 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:05.151 16:14:33 -- bdev/nbd_common.sh@51 -- # local i 00:06:05.151 16:14:33 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:05.151 16:14:33 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:05.408 16:14:34 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:05.408 16:14:34 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:05.408 16:14:34 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:05.408 16:14:34 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:05.408 16:14:34 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:05.408 16:14:34 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:05.408 16:14:34 -- bdev/nbd_common.sh@41 -- # break 00:06:05.408 16:14:34 -- bdev/nbd_common.sh@45 -- # return 0 00:06:05.408 16:14:34 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:05.408 16:14:34 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:05.666 16:14:34 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:05.666 16:14:34 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:05.666 16:14:34 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:05.666 16:14:34 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:05.666 16:14:34 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:05.666 16:14:34 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:05.666 16:14:34 -- bdev/nbd_common.sh@41 -- # break 00:06:05.666 16:14:34 -- bdev/nbd_common.sh@45 -- # return 0 00:06:05.666 16:14:34 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:05.666 16:14:34 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:05.666 16:14:34 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:05.666 16:14:34 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:05.666 16:14:34 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:05.666 16:14:34 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:05.666 16:14:34 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:05.666 16:14:34 -- bdev/nbd_common.sh@65 -- # echo '' 00:06:05.666 16:14:34 -- 
bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:05.666 16:14:34 -- bdev/nbd_common.sh@65 -- # true 00:06:05.666 16:14:34 -- bdev/nbd_common.sh@65 -- # count=0 00:06:05.666 16:14:34 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:05.666 16:14:34 -- bdev/nbd_common.sh@104 -- # count=0 00:06:05.666 16:14:34 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:05.666 16:14:34 -- bdev/nbd_common.sh@109 -- # return 0 00:06:05.666 16:14:34 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:05.925 16:14:34 -- event/event.sh@35 -- # sleep 3 00:06:06.184 [2024-07-20 16:14:34.810718] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:06.184 [2024-07-20 16:14:34.843231] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:06.184 [2024-07-20 16:14:34.843234] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:06.184 [2024-07-20 16:14:34.884120] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:06.184 [2024-07-20 16:14:34.884164] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:09.474 16:14:37 -- event/event.sh@23 -- # for i in {0..2} 00:06:09.474 16:14:37 -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:06:09.474 spdk_app_start Round 2 00:06:09.474 16:14:37 -- event/event.sh@25 -- # waitforlisten 2250630 /var/tmp/spdk-nbd.sock 00:06:09.474 16:14:37 -- common/autotest_common.sh@819 -- # '[' -z 2250630 ']' 00:06:09.474 16:14:37 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:09.474 16:14:37 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:09.474 16:14:37 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:09.474 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
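Each 'spdk_app_start Round N' iteration follows the same script: create the two 64 MB malloc bdevs, attach them as /dev/nbd0 and /dev/nbd1, run the write/verify pass, detach, then ask the app to cycle itself with spdk_kill_instance SIGTERM and wait for its socket to return. A rough reconstruction of the loop in test/event/event.sh, with nbd_rpc_data_verify being the harness function seen above:

rpc="$SPDK_ROOT/scripts/rpc.py -s /var/tmp/spdk-nbd.sock"
for i in 0 1 2; do
  echo "spdk_app_start Round $i"
  $rpc bdev_malloc_create 64 4096                        # Malloc0
  $rpc bdev_malloc_create 64 4096                        # Malloc1
  nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1'
  $rpc spdk_kill_instance SIGTERM                        # app_repeat traps this and reinitializes
  sleep 3
done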
00:06:09.474 16:14:37 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:09.474 16:14:37 -- common/autotest_common.sh@10 -- # set +x 00:06:09.474 16:14:37 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:09.474 16:14:37 -- common/autotest_common.sh@852 -- # return 0 00:06:09.474 16:14:37 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:09.474 Malloc0 00:06:09.474 16:14:37 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:09.474 Malloc1 00:06:09.474 16:14:38 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:09.474 16:14:38 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:09.474 16:14:38 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:09.474 16:14:38 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:09.474 16:14:38 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:09.474 16:14:38 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:09.474 16:14:38 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:09.474 16:14:38 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:09.474 16:14:38 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:09.474 16:14:38 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:09.474 16:14:38 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:09.474 16:14:38 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:09.474 16:14:38 -- bdev/nbd_common.sh@12 -- # local i 00:06:09.474 16:14:38 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:09.474 16:14:38 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:09.474 16:14:38 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:09.733 /dev/nbd0 00:06:09.733 16:14:38 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:09.733 16:14:38 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:09.733 16:14:38 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:06:09.733 16:14:38 -- common/autotest_common.sh@857 -- # local i 00:06:09.733 16:14:38 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:06:09.733 16:14:38 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:06:09.733 16:14:38 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:06:09.733 16:14:38 -- common/autotest_common.sh@861 -- # break 00:06:09.733 16:14:38 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:06:09.733 16:14:38 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:06:09.733 16:14:38 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:09.733 1+0 records in 00:06:09.733 1+0 records out 00:06:09.733 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000254129 s, 16.1 MB/s 00:06:09.733 16:14:38 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:09.733 16:14:38 -- common/autotest_common.sh@874 -- # size=4096 00:06:09.733 16:14:38 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:09.733 16:14:38 -- 
common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:06:09.733 16:14:38 -- common/autotest_common.sh@877 -- # return 0 00:06:09.733 16:14:38 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:09.733 16:14:38 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:09.733 16:14:38 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:09.992 /dev/nbd1 00:06:09.992 16:14:38 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:09.992 16:14:38 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:09.992 16:14:38 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:06:09.992 16:14:38 -- common/autotest_common.sh@857 -- # local i 00:06:09.992 16:14:38 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:06:09.992 16:14:38 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:06:09.992 16:14:38 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:06:09.992 16:14:38 -- common/autotest_common.sh@861 -- # break 00:06:09.992 16:14:38 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:06:09.992 16:14:38 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:06:09.992 16:14:38 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:09.992 1+0 records in 00:06:09.992 1+0 records out 00:06:09.992 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000200335 s, 20.4 MB/s 00:06:09.992 16:14:38 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:09.992 16:14:38 -- common/autotest_common.sh@874 -- # size=4096 00:06:09.992 16:14:38 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:09.992 16:14:38 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:06:09.992 16:14:38 -- common/autotest_common.sh@877 -- # return 0 00:06:09.992 16:14:38 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:09.992 16:14:38 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:09.992 16:14:38 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:09.992 16:14:38 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:09.992 16:14:38 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:09.992 16:14:38 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:09.992 { 00:06:09.992 "nbd_device": "/dev/nbd0", 00:06:09.992 "bdev_name": "Malloc0" 00:06:09.992 }, 00:06:09.992 { 00:06:09.992 "nbd_device": "/dev/nbd1", 00:06:09.992 "bdev_name": "Malloc1" 00:06:09.992 } 00:06:09.992 ]' 00:06:09.992 16:14:38 -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:09.992 { 00:06:09.992 "nbd_device": "/dev/nbd0", 00:06:09.992 "bdev_name": "Malloc0" 00:06:09.992 }, 00:06:09.992 { 00:06:09.992 "nbd_device": "/dev/nbd1", 00:06:09.992 "bdev_name": "Malloc1" 00:06:09.992 } 00:06:09.992 ]' 00:06:09.992 16:14:38 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:10.251 16:14:38 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:10.251 /dev/nbd1' 00:06:10.251 16:14:38 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:10.251 /dev/nbd1' 00:06:10.251 16:14:38 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:10.251 16:14:38 -- bdev/nbd_common.sh@65 -- # count=2 00:06:10.251 16:14:38 -- bdev/nbd_common.sh@66 -- # echo 2 00:06:10.251 16:14:38 -- 
bdev/nbd_common.sh@95 -- # count=2 00:06:10.251 16:14:38 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:10.251 16:14:38 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:10.251 16:14:38 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:10.251 16:14:38 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:10.251 16:14:38 -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:10.251 16:14:38 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:10.251 16:14:38 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:10.251 16:14:38 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:10.251 256+0 records in 00:06:10.251 256+0 records out 00:06:10.251 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0114285 s, 91.8 MB/s 00:06:10.251 16:14:38 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:10.251 16:14:38 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:10.251 256+0 records in 00:06:10.251 256+0 records out 00:06:10.251 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0202269 s, 51.8 MB/s 00:06:10.251 16:14:38 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:10.251 16:14:38 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:10.251 256+0 records in 00:06:10.251 256+0 records out 00:06:10.251 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0214368 s, 48.9 MB/s 00:06:10.251 16:14:38 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:10.251 16:14:38 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:10.251 16:14:38 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:10.251 16:14:38 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:10.251 16:14:38 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:10.251 16:14:38 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:10.251 16:14:38 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:10.251 16:14:38 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:10.251 16:14:38 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:10.251 16:14:38 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:10.251 16:14:38 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:10.251 16:14:38 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:10.251 16:14:38 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:10.251 16:14:38 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:10.251 16:14:38 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:10.251 16:14:38 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:10.251 16:14:38 -- bdev/nbd_common.sh@51 -- # local i 00:06:10.251 16:14:38 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:10.251 16:14:38 -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:10.510 16:14:39 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:10.510 16:14:39 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:10.510 16:14:39 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:10.510 16:14:39 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:10.510 16:14:39 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:10.510 16:14:39 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:10.510 16:14:39 -- bdev/nbd_common.sh@41 -- # break 00:06:10.510 16:14:39 -- bdev/nbd_common.sh@45 -- # return 0 00:06:10.510 16:14:39 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:10.510 16:14:39 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:10.510 16:14:39 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:10.510 16:14:39 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:10.511 16:14:39 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:10.511 16:14:39 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:10.511 16:14:39 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:10.511 16:14:39 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:10.511 16:14:39 -- bdev/nbd_common.sh@41 -- # break 00:06:10.511 16:14:39 -- bdev/nbd_common.sh@45 -- # return 0 00:06:10.511 16:14:39 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:10.511 16:14:39 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:10.511 16:14:39 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:10.769 16:14:39 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:10.769 16:14:39 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:10.769 16:14:39 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:10.769 16:14:39 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:10.769 16:14:39 -- bdev/nbd_common.sh@65 -- # echo '' 00:06:10.769 16:14:39 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:10.769 16:14:39 -- bdev/nbd_common.sh@65 -- # true 00:06:10.769 16:14:39 -- bdev/nbd_common.sh@65 -- # count=0 00:06:10.769 16:14:39 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:10.769 16:14:39 -- bdev/nbd_common.sh@104 -- # count=0 00:06:10.769 16:14:39 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:10.769 16:14:39 -- bdev/nbd_common.sh@109 -- # return 0 00:06:10.769 16:14:39 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:11.027 16:14:39 -- event/event.sh@35 -- # sleep 3 00:06:11.284 [2024-07-20 16:14:39.852282] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:11.284 [2024-07-20 16:14:39.884626] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:11.284 [2024-07-20 16:14:39.884629] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:11.284 [2024-07-20 16:14:39.925221] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:11.284 [2024-07-20 16:14:39.925265] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
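[editor's note] The teardown just traced stops each nbd device over the RPC socket, then polls /proc/partitions until the kernel entry disappears. A minimal sketch of that pattern, assuming a short sleep between retries (the trace shows the 20-iteration budget but not the interval):

    # Hedged reconstruction of the nbd_stop_disk/waitfornbd_exit flow above.
    # rpc.py path and socket are from the log; the sleep is an assumption.
    rpc=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py
    for dev in /dev/nbd0 /dev/nbd1; do
        "$rpc" -s /var/tmp/spdk-nbd.sock nbd_stop_disk "$dev"
        name=$(basename "$dev")
        for ((i = 1; i <= 20; i++)); do
            # done once the partition table no longer lists the device
            grep -q -w "$name" /proc/partitions || break
            sleep 0.1   # assumed backoff; not visible in the trace
        done
    done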
00:06:14.588 16:14:42 -- event/event.sh@38 -- # waitforlisten 2250630 /var/tmp/spdk-nbd.sock 00:06:14.588 16:14:42 -- common/autotest_common.sh@819 -- # '[' -z 2250630 ']' 00:06:14.588 16:14:42 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:14.588 16:14:42 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:14.588 16:14:42 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:14.588 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:14.588 16:14:42 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:14.588 16:14:42 -- common/autotest_common.sh@10 -- # set +x 00:06:14.588 16:14:42 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:14.588 16:14:42 -- common/autotest_common.sh@852 -- # return 0 00:06:14.588 16:14:42 -- event/event.sh@39 -- # killprocess 2250630 00:06:14.588 16:14:42 -- common/autotest_common.sh@926 -- # '[' -z 2250630 ']' 00:06:14.588 16:14:42 -- common/autotest_common.sh@930 -- # kill -0 2250630 00:06:14.588 16:14:42 -- common/autotest_common.sh@931 -- # uname 00:06:14.588 16:14:42 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:14.588 16:14:42 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2250630 00:06:14.588 16:14:42 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:14.588 16:14:42 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:14.588 16:14:42 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2250630' 00:06:14.588 killing process with pid 2250630 00:06:14.588 16:14:42 -- common/autotest_common.sh@945 -- # kill 2250630 00:06:14.588 16:14:42 -- common/autotest_common.sh@950 -- # wait 2250630 00:06:14.588 spdk_app_start is called in Round 0. 00:06:14.588 Shutdown signal received, stop current app iteration 00:06:14.588 Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 reinitialization... 00:06:14.588 spdk_app_start is called in Round 1. 00:06:14.588 Shutdown signal received, stop current app iteration 00:06:14.588 Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 reinitialization... 00:06:14.588 spdk_app_start is called in Round 2. 00:06:14.588 Shutdown signal received, stop current app iteration 00:06:14.588 Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 reinitialization... 00:06:14.588 spdk_app_start is called in Round 3. 
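[editor's note] Each restart round blocks in waitforlisten until the app answers on its UNIX socket again. A sketch of that helper, assuming rpc.py's rpc_get_methods as the liveness probe (the trace shows the retry budget and the waiting message, not the probe itself):

    # Sketch of waitforlisten as exercised in this log. max_retries and the
    # echo match the trace; probe command and poll interval are assumptions.
    waitforlisten() {
        local pid=$1
        local rpc_addr=${2:-/var/tmp/spdk.sock}
        local max_retries=100
        echo "Waiting for process to start up and listen on UNIX domain socket $rpc_addr..."
        while ((max_retries-- > 0)); do
            kill -0 "$pid" 2>/dev/null || return 1       # target died while waiting
            if /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py \
                   -s "$rpc_addr" -t 1 rpc_get_methods &>/dev/null; then
                return 0                                 # socket is accepting RPCs
            fi
            sleep 0.5                                    # assumed poll interval
        done
        return 1
    }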
00:06:14.588 Shutdown signal received, stop current app iteration 00:06:14.588 16:14:43 -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:06:14.588 16:14:43 -- event/event.sh@42 -- # return 0 00:06:14.588 00:06:14.588 real 0m16.118s 00:06:14.588 user 0m34.210s 00:06:14.588 sys 0m3.109s 00:06:14.588 16:14:43 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:14.588 16:14:43 -- common/autotest_common.sh@10 -- # set +x 00:06:14.588 ************************************ 00:06:14.588 END TEST app_repeat 00:06:14.588 ************************************ 00:06:14.588 16:14:43 -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:06:14.588 16:14:43 -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:14.588 16:14:43 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:14.588 16:14:43 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:14.588 16:14:43 -- common/autotest_common.sh@10 -- # set +x 00:06:14.588 ************************************ 00:06:14.588 START TEST cpu_locks 00:06:14.588 ************************************ 00:06:14.588 16:14:43 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:14.588 * Looking for test storage... 00:06:14.588 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:06:14.588 16:14:43 -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:06:14.588 16:14:43 -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:06:14.588 16:14:43 -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:06:14.588 16:14:43 -- event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:06:14.588 16:14:43 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:14.588 16:14:43 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:14.588 16:14:43 -- common/autotest_common.sh@10 -- # set +x 00:06:14.588 ************************************ 00:06:14.588 START TEST default_locks 00:06:14.588 ************************************ 00:06:14.588 16:14:43 -- common/autotest_common.sh@1104 -- # default_locks 00:06:14.588 16:14:43 -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=2253573 00:06:14.589 16:14:43 -- event/cpu_locks.sh@47 -- # waitforlisten 2253573 00:06:14.589 16:14:43 -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:14.589 16:14:43 -- common/autotest_common.sh@819 -- # '[' -z 2253573 ']' 00:06:14.589 16:14:43 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:14.589 16:14:43 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:14.589 16:14:43 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:14.589 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:14.589 16:14:43 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:14.589 16:14:43 -- common/autotest_common.sh@10 -- # set +x 00:06:14.589 [2024-07-20 16:14:43.207851] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
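[editor's note] The START/END banners and the real/user/sys summaries that frame every test in this log come from a run_test wrapper. A hedged sketch of its likely shape (banner text matches the log; the actual autotest_common.sh implementation may differ):

    # run_test prints a banner, times the test body, prints a closing banner.
    run_test() {
        local test_name=$1
        shift
        echo "************************************"
        echo "START TEST $test_name"
        echo "************************************"
        time "$@"                # produces the real/user/sys lines seen here
        local rc=$?
        echo "************************************"
        echo "END TEST $test_name"
        echo "************************************"
        return $rc
    }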
00:06:14.589 [2024-07-20 16:14:43.207921] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2253573 ] 00:06:14.589 EAL: No free 2048 kB hugepages reported on node 1 00:06:14.589 [2024-07-20 16:14:43.273764] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:14.589 [2024-07-20 16:14:43.310857] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:14.589 [2024-07-20 16:14:43.310971] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:15.525 16:14:44 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:15.525 16:14:44 -- common/autotest_common.sh@852 -- # return 0 00:06:15.525 16:14:44 -- event/cpu_locks.sh@49 -- # locks_exist 2253573 00:06:15.525 16:14:44 -- event/cpu_locks.sh@22 -- # lslocks -p 2253573 00:06:15.525 16:14:44 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:15.783 lslocks: write error 00:06:15.783 16:14:44 -- event/cpu_locks.sh@50 -- # killprocess 2253573 00:06:15.783 16:14:44 -- common/autotest_common.sh@926 -- # '[' -z 2253573 ']' 00:06:15.783 16:14:44 -- common/autotest_common.sh@930 -- # kill -0 2253573 00:06:15.783 16:14:44 -- common/autotest_common.sh@931 -- # uname 00:06:15.783 16:14:44 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:15.783 16:14:44 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2253573 00:06:15.783 16:14:44 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:15.783 16:14:44 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:15.783 16:14:44 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2253573' 00:06:15.783 killing process with pid 2253573 00:06:15.783 16:14:44 -- common/autotest_common.sh@945 -- # kill 2253573 00:06:15.783 16:14:44 -- common/autotest_common.sh@950 -- # wait 2253573 00:06:16.041 16:14:44 -- event/cpu_locks.sh@52 -- # NOT waitforlisten 2253573 00:06:16.041 16:14:44 -- common/autotest_common.sh@640 -- # local es=0 00:06:16.041 16:14:44 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 2253573 00:06:16.041 16:14:44 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:06:16.041 16:14:44 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:16.041 16:14:44 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:06:16.041 16:14:44 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:16.041 16:14:44 -- common/autotest_common.sh@643 -- # waitforlisten 2253573 00:06:16.041 16:14:44 -- common/autotest_common.sh@819 -- # '[' -z 2253573 ']' 00:06:16.041 16:14:44 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:16.041 16:14:44 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:16.041 16:14:44 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:16.041 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
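[editor's note] The lock check traced here leans on lslocks: a target started with -m 0x1 takes a file lock on /var/tmp/spdk_cpu_lock_000, and lslocks attributes it to the pid. As a standalone helper (the grep pattern is straight from the trace; the recurring "lslocks: write error" is lslocks hitting a closed pipe once grep -q matches and exits):

    # Succeeds iff the given pid holds at least one spdk_cpu_lock file lock.
    locks_exist() {
        local pid=$1
        lslocks -p "$pid" | grep -q spdk_cpu_lock
    }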
00:06:16.041 16:14:44 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:16.041 16:14:44 -- common/autotest_common.sh@10 -- # set +x 00:06:16.041 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (2253573) - No such process 00:06:16.041 ERROR: process (pid: 2253573) is no longer running 00:06:16.041 16:14:44 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:16.041 16:14:44 -- common/autotest_common.sh@852 -- # return 1 00:06:16.041 16:14:44 -- common/autotest_common.sh@643 -- # es=1 00:06:16.041 16:14:44 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:16.041 16:14:44 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:16.041 16:14:44 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:16.041 16:14:44 -- event/cpu_locks.sh@54 -- # no_locks 00:06:16.041 16:14:44 -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:16.041 16:14:44 -- event/cpu_locks.sh@26 -- # local lock_files 00:06:16.041 16:14:44 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:16.041 00:06:16.041 real 0m1.590s 00:06:16.041 user 0m1.642s 00:06:16.041 sys 0m0.568s 00:06:16.041 16:14:44 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:16.041 16:14:44 -- common/autotest_common.sh@10 -- # set +x 00:06:16.041 ************************************ 00:06:16.041 END TEST default_locks 00:06:16.041 ************************************ 00:06:16.041 16:14:44 -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:06:16.041 16:14:44 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:16.041 16:14:44 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:16.041 16:14:44 -- common/autotest_common.sh@10 -- # set +x 00:06:16.041 ************************************ 00:06:16.041 START TEST default_locks_via_rpc 00:06:16.041 ************************************ 00:06:16.041 16:14:44 -- common/autotest_common.sh@1104 -- # default_locks_via_rpc 00:06:16.041 16:14:44 -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=2253875 00:06:16.041 16:14:44 -- event/cpu_locks.sh@63 -- # waitforlisten 2253875 00:06:16.041 16:14:44 -- common/autotest_common.sh@819 -- # '[' -z 2253875 ']' 00:06:16.041 16:14:44 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:16.041 16:14:44 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:16.041 16:14:44 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:16.041 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:16.041 16:14:44 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:16.041 16:14:44 -- common/autotest_common.sh@10 -- # set +x 00:06:16.041 16:14:44 -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:16.041 [2024-07-20 16:14:44.842787] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
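[editor's note] The NOT wrapper completed above runs a command that is expected to fail and succeeds only when it does. A minimal sketch consistent with the traced exit-status checks (treating es > 128, death by signal, as a failure of NOT itself is an assumption):

    # NOT inverts a command's outcome: return 0 iff the wrapped command failed.
    NOT() {
        local es=0
        "$@" || es=$?
        ((es > 128)) && return 1   # assumed: killed by signal is not a clean failure
        ((es != 0))                # success only when the command failed
    }
    # From the trace: NOT waitforlisten 2253573 passes because the process
    # was already killed, so waitforlisten returns 1.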
00:06:16.041 [2024-07-20 16:14:44.842879] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2253875 ] 00:06:16.299 EAL: No free 2048 kB hugepages reported on node 1 00:06:16.299 [2024-07-20 16:14:44.912035] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:16.299 [2024-07-20 16:14:44.949070] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:16.299 [2024-07-20 16:14:44.949183] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:16.867 16:14:45 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:16.867 16:14:45 -- common/autotest_common.sh@852 -- # return 0 00:06:16.867 16:14:45 -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:06:16.867 16:14:45 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:16.867 16:14:45 -- common/autotest_common.sh@10 -- # set +x 00:06:16.867 16:14:45 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:16.867 16:14:45 -- event/cpu_locks.sh@67 -- # no_locks 00:06:16.867 16:14:45 -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:16.867 16:14:45 -- event/cpu_locks.sh@26 -- # local lock_files 00:06:16.867 16:14:45 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:16.867 16:14:45 -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:06:16.867 16:14:45 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:16.867 16:14:45 -- common/autotest_common.sh@10 -- # set +x 00:06:16.867 16:14:45 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:16.867 16:14:45 -- event/cpu_locks.sh@71 -- # locks_exist 2253875 00:06:16.867 16:14:45 -- event/cpu_locks.sh@22 -- # lslocks -p 2253875 00:06:16.867 16:14:45 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:17.435 16:14:46 -- event/cpu_locks.sh@73 -- # killprocess 2253875 00:06:17.435 16:14:46 -- common/autotest_common.sh@926 -- # '[' -z 2253875 ']' 00:06:17.435 16:14:46 -- common/autotest_common.sh@930 -- # kill -0 2253875 00:06:17.435 16:14:46 -- common/autotest_common.sh@931 -- # uname 00:06:17.435 16:14:46 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:17.435 16:14:46 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2253875 00:06:17.435 16:14:46 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:17.435 16:14:46 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:17.435 16:14:46 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2253875' 00:06:17.435 killing process with pid 2253875 00:06:17.435 16:14:46 -- common/autotest_common.sh@945 -- # kill 2253875 00:06:17.435 16:14:46 -- common/autotest_common.sh@950 -- # wait 2253875 00:06:17.693 00:06:17.693 real 0m1.534s 00:06:17.693 user 0m1.584s 00:06:17.693 sys 0m0.538s 00:06:17.693 16:14:46 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:17.693 16:14:46 -- common/autotest_common.sh@10 -- # set +x 00:06:17.693 ************************************ 00:06:17.693 END TEST default_locks_via_rpc 00:06:17.693 ************************************ 00:06:17.693 16:14:46 -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:06:17.693 16:14:46 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:17.693 16:14:46 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:17.693 16:14:46 -- 
common/autotest_common.sh@10 -- # set +x 00:06:17.693 ************************************ 00:06:17.693 START TEST non_locking_app_on_locked_coremask 00:06:17.693 ************************************ 00:06:17.693 16:14:46 -- common/autotest_common.sh@1104 -- # non_locking_app_on_locked_coremask 00:06:17.693 16:14:46 -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=2254176 00:06:17.693 16:14:46 -- event/cpu_locks.sh@81 -- # waitforlisten 2254176 /var/tmp/spdk.sock 00:06:17.693 16:14:46 -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:17.693 16:14:46 -- common/autotest_common.sh@819 -- # '[' -z 2254176 ']' 00:06:17.693 16:14:46 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:17.694 16:14:46 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:17.694 16:14:46 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:17.694 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:17.694 16:14:46 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:17.694 16:14:46 -- common/autotest_common.sh@10 -- # set +x 00:06:17.694 [2024-07-20 16:14:46.426614] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:17.694 [2024-07-20 16:14:46.426690] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2254176 ] 00:06:17.694 EAL: No free 2048 kB hugepages reported on node 1 00:06:17.694 [2024-07-20 16:14:46.495043] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:17.952 [2024-07-20 16:14:46.529616] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:17.952 [2024-07-20 16:14:46.529750] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:18.519 16:14:47 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:18.519 16:14:47 -- common/autotest_common.sh@852 -- # return 0 00:06:18.519 16:14:47 -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=2254439 00:06:18.519 16:14:47 -- event/cpu_locks.sh@85 -- # waitforlisten 2254439 /var/tmp/spdk2.sock 00:06:18.519 16:14:47 -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:06:18.519 16:14:47 -- common/autotest_common.sh@819 -- # '[' -z 2254439 ']' 00:06:18.519 16:14:47 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:18.519 16:14:47 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:18.519 16:14:47 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:18.519 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:18.519 16:14:47 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:18.519 16:14:47 -- common/autotest_common.sh@10 -- # set +x 00:06:18.519 [2024-07-20 16:14:47.279286] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
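[editor's note] non_locking_app_on_locked_coremask, as traced, shows a second target sharing core 0 only because it opts out of locking: the first instance claims the core-lock file, the second passes --disable-cpumask-locks and its own RPC socket. Condensed from the trace:

    spdk_tgt=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt
    "$spdk_tgt" -m 0x1 &                                # claims /var/tmp/spdk_cpu_lock_000
    pid1=$!
    waitforlisten "$pid1" /var/tmp/spdk.sock
    "$spdk_tgt" -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock &
    pid2=$!
    waitforlisten "$pid2" /var/tmp/spdk2.sock           # succeeds: no second lock attempt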
00:06:18.519 [2024-07-20 16:14:47.279358] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2254439 ] 00:06:18.519 EAL: No free 2048 kB hugepages reported on node 1 00:06:18.777 [2024-07-20 16:14:47.367850] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:18.777 [2024-07-20 16:14:47.367880] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:18.777 [2024-07-20 16:14:47.440026] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:18.777 [2024-07-20 16:14:47.440158] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:19.345 16:14:48 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:19.345 16:14:48 -- common/autotest_common.sh@852 -- # return 0 00:06:19.345 16:14:48 -- event/cpu_locks.sh@87 -- # locks_exist 2254176 00:06:19.345 16:14:48 -- event/cpu_locks.sh@22 -- # lslocks -p 2254176 00:06:19.345 16:14:48 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:20.721 lslocks: write error 00:06:20.721 16:14:49 -- event/cpu_locks.sh@89 -- # killprocess 2254176 00:06:20.721 16:14:49 -- common/autotest_common.sh@926 -- # '[' -z 2254176 ']' 00:06:20.721 16:14:49 -- common/autotest_common.sh@930 -- # kill -0 2254176 00:06:20.721 16:14:49 -- common/autotest_common.sh@931 -- # uname 00:06:20.721 16:14:49 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:20.721 16:14:49 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2254176 00:06:20.721 16:14:49 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:20.721 16:14:49 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:20.721 16:14:49 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2254176' 00:06:20.721 killing process with pid 2254176 00:06:20.721 16:14:49 -- common/autotest_common.sh@945 -- # kill 2254176 00:06:20.721 16:14:49 -- common/autotest_common.sh@950 -- # wait 2254176 00:06:21.292 16:14:49 -- event/cpu_locks.sh@90 -- # killprocess 2254439 00:06:21.292 16:14:49 -- common/autotest_common.sh@926 -- # '[' -z 2254439 ']' 00:06:21.292 16:14:49 -- common/autotest_common.sh@930 -- # kill -0 2254439 00:06:21.292 16:14:49 -- common/autotest_common.sh@931 -- # uname 00:06:21.292 16:14:49 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:21.292 16:14:49 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2254439 00:06:21.292 16:14:50 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:21.292 16:14:50 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:21.292 16:14:50 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2254439' 00:06:21.292 killing process with pid 2254439 00:06:21.292 16:14:50 -- common/autotest_common.sh@945 -- # kill 2254439 00:06:21.292 16:14:50 -- common/autotest_common.sh@950 -- # wait 2254439 00:06:21.552 00:06:21.552 real 0m3.907s 00:06:21.552 user 0m4.143s 00:06:21.552 sys 0m1.341s 00:06:21.552 16:14:50 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:21.552 16:14:50 -- common/autotest_common.sh@10 -- # set +x 00:06:21.552 ************************************ 00:06:21.552 END TEST non_locking_app_on_locked_coremask 00:06:21.552 ************************************ 00:06:21.552 16:14:50 -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask 
locking_app_on_unlocked_coremask 00:06:21.552 16:14:50 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:21.552 16:14:50 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:21.552 16:14:50 -- common/autotest_common.sh@10 -- # set +x 00:06:21.812 ************************************ 00:06:21.812 START TEST locking_app_on_unlocked_coremask 00:06:21.812 ************************************ 00:06:21.812 16:14:50 -- common/autotest_common.sh@1104 -- # locking_app_on_unlocked_coremask 00:06:21.812 16:14:50 -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=2255011 00:06:21.812 16:14:50 -- event/cpu_locks.sh@99 -- # waitforlisten 2255011 /var/tmp/spdk.sock 00:06:21.812 16:14:50 -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:06:21.812 16:14:50 -- common/autotest_common.sh@819 -- # '[' -z 2255011 ']' 00:06:21.812 16:14:50 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:21.812 16:14:50 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:21.812 16:14:50 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:21.812 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:21.812 16:14:50 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:21.812 16:14:50 -- common/autotest_common.sh@10 -- # set +x 00:06:21.812 [2024-07-20 16:14:50.383135] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:21.812 [2024-07-20 16:14:50.383222] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2255011 ] 00:06:21.812 EAL: No free 2048 kB hugepages reported on node 1 00:06:21.812 [2024-07-20 16:14:50.450464] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:21.812 [2024-07-20 16:14:50.450487] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:21.812 [2024-07-20 16:14:50.487618] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:21.812 [2024-07-20 16:14:50.487731] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:22.750 16:14:51 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:22.750 16:14:51 -- common/autotest_common.sh@852 -- # return 0 00:06:22.750 16:14:51 -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=2255085 00:06:22.750 16:14:51 -- event/cpu_locks.sh@103 -- # waitforlisten 2255085 /var/tmp/spdk2.sock 00:06:22.750 16:14:51 -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:06:22.750 16:14:51 -- common/autotest_common.sh@819 -- # '[' -z 2255085 ']' 00:06:22.750 16:14:51 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:22.750 16:14:51 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:22.750 16:14:51 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:22.750 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
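[editor's note] Every one of these lock tests tears its targets down through killprocess. Its traced shape checks that the pid is alive and is not a sudo wrapper before signalling it, then reaps it (reactor_0 is the comm name of SPDK's primary reactor thread, which is why it shows up in the ps check):

    killprocess() {
        local pid=$1
        kill -0 "$pid" || return 1                      # still running?
        if [[ $(uname) == Linux ]]; then
            local process_name
            process_name=$(ps --no-headers -o comm= "$pid")
            [[ $process_name == sudo ]] && return 1     # never signal a sudo wrapper
        fi
        echo "killing process with pid $pid"
        kill "$pid"
        wait "$pid" || true                             # reap; SIGTERM exit is expected
    }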
00:06:22.750 16:14:51 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:22.750 16:14:51 -- common/autotest_common.sh@10 -- # set +x 00:06:22.750 [2024-07-20 16:14:51.217452] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:22.750 [2024-07-20 16:14:51.217519] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2255085 ] 00:06:22.750 EAL: No free 2048 kB hugepages reported on node 1 00:06:22.750 [2024-07-20 16:14:51.310508] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:22.750 [2024-07-20 16:14:51.384265] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:22.750 [2024-07-20 16:14:51.384393] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:23.319 16:14:52 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:23.319 16:14:52 -- common/autotest_common.sh@852 -- # return 0 00:06:23.319 16:14:52 -- event/cpu_locks.sh@105 -- # locks_exist 2255085 00:06:23.319 16:14:52 -- event/cpu_locks.sh@22 -- # lslocks -p 2255085 00:06:23.319 16:14:52 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:24.256 lslocks: write error 00:06:24.256 16:14:52 -- event/cpu_locks.sh@107 -- # killprocess 2255011 00:06:24.256 16:14:52 -- common/autotest_common.sh@926 -- # '[' -z 2255011 ']' 00:06:24.256 16:14:52 -- common/autotest_common.sh@930 -- # kill -0 2255011 00:06:24.256 16:14:52 -- common/autotest_common.sh@931 -- # uname 00:06:24.256 16:14:52 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:24.256 16:14:52 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2255011 00:06:24.256 16:14:52 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:24.256 16:14:52 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:24.256 16:14:52 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2255011' 00:06:24.256 killing process with pid 2255011 00:06:24.256 16:14:52 -- common/autotest_common.sh@945 -- # kill 2255011 00:06:24.256 16:14:52 -- common/autotest_common.sh@950 -- # wait 2255011 00:06:24.823 16:14:53 -- event/cpu_locks.sh@108 -- # killprocess 2255085 00:06:24.823 16:14:53 -- common/autotest_common.sh@926 -- # '[' -z 2255085 ']' 00:06:24.823 16:14:53 -- common/autotest_common.sh@930 -- # kill -0 2255085 00:06:24.823 16:14:53 -- common/autotest_common.sh@931 -- # uname 00:06:24.823 16:14:53 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:24.823 16:14:53 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2255085 00:06:24.823 16:14:53 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:24.823 16:14:53 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:24.823 16:14:53 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2255085' 00:06:24.823 killing process with pid 2255085 00:06:24.823 16:14:53 -- common/autotest_common.sh@945 -- # kill 2255085 00:06:24.823 16:14:53 -- common/autotest_common.sh@950 -- # wait 2255085 00:06:25.083 00:06:25.083 real 0m3.467s 00:06:25.083 user 0m3.691s 00:06:25.083 sys 0m1.145s 00:06:25.083 16:14:53 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:25.083 16:14:53 -- common/autotest_common.sh@10 -- # set +x 00:06:25.083 ************************************ 00:06:25.083 END TEST locking_app_on_unlocked_coremask 
00:06:25.083 ************************************ 00:06:25.083 16:14:53 -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:06:25.083 16:14:53 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:25.083 16:14:53 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:25.083 16:14:53 -- common/autotest_common.sh@10 -- # set +x 00:06:25.083 ************************************ 00:06:25.083 START TEST locking_app_on_locked_coremask 00:06:25.083 ************************************ 00:06:25.083 16:14:53 -- common/autotest_common.sh@1104 -- # locking_app_on_locked_coremask 00:06:25.083 16:14:53 -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=2255598 00:06:25.083 16:14:53 -- event/cpu_locks.sh@116 -- # waitforlisten 2255598 /var/tmp/spdk.sock 00:06:25.083 16:14:53 -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:25.083 16:14:53 -- common/autotest_common.sh@819 -- # '[' -z 2255598 ']' 00:06:25.083 16:14:53 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:25.083 16:14:53 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:25.083 16:14:53 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:25.083 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:25.083 16:14:53 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:25.083 16:14:53 -- common/autotest_common.sh@10 -- # set +x 00:06:25.342 [2024-07-20 16:14:53.897005] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:25.342 [2024-07-20 16:14:53.897078] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2255598 ] 00:06:25.342 EAL: No free 2048 kB hugepages reported on node 1 00:06:25.342 [2024-07-20 16:14:53.963601] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:25.342 [2024-07-20 16:14:54.001350] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:25.342 [2024-07-20 16:14:54.001491] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:25.909 16:14:54 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:25.909 16:14:54 -- common/autotest_common.sh@852 -- # return 0 00:06:25.909 16:14:54 -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=2255866 00:06:25.909 16:14:54 -- event/cpu_locks.sh@120 -- # NOT waitforlisten 2255866 /var/tmp/spdk2.sock 00:06:25.909 16:14:54 -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:06:25.909 16:14:54 -- common/autotest_common.sh@640 -- # local es=0 00:06:25.909 16:14:54 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 2255866 /var/tmp/spdk2.sock 00:06:25.909 16:14:54 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:06:26.168 16:14:54 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:26.168 16:14:54 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:06:26.168 16:14:54 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:26.168 16:14:54 -- common/autotest_common.sh@643 -- # waitforlisten 2255866 /var/tmp/spdk2.sock 00:06:26.168 16:14:54 -- 
common/autotest_common.sh@819 -- # '[' -z 2255866 ']' 00:06:26.168 16:14:54 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:26.168 16:14:54 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:26.168 16:14:54 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:26.168 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:26.168 16:14:54 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:26.168 16:14:54 -- common/autotest_common.sh@10 -- # set +x 00:06:26.168 [2024-07-20 16:14:54.733640] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:26.168 [2024-07-20 16:14:54.733729] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2255866 ] 00:06:26.168 EAL: No free 2048 kB hugepages reported on node 1 00:06:26.168 [2024-07-20 16:14:54.822351] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 2255598 has claimed it. 00:06:26.168 [2024-07-20 16:14:54.822385] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:06:26.838 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (2255866) - No such process 00:06:26.838 ERROR: process (pid: 2255866) is no longer running 00:06:26.838 16:14:55 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:26.838 16:14:55 -- common/autotest_common.sh@852 -- # return 1 00:06:26.838 16:14:55 -- common/autotest_common.sh@643 -- # es=1 00:06:26.838 16:14:55 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:26.838 16:14:55 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:26.838 16:14:55 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:26.838 16:14:55 -- event/cpu_locks.sh@122 -- # locks_exist 2255598 00:06:26.838 16:14:55 -- event/cpu_locks.sh@22 -- # lslocks -p 2255598 00:06:26.838 16:14:55 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:27.403 lslocks: write error 00:06:27.403 16:14:56 -- event/cpu_locks.sh@124 -- # killprocess 2255598 00:06:27.403 16:14:56 -- common/autotest_common.sh@926 -- # '[' -z 2255598 ']' 00:06:27.403 16:14:56 -- common/autotest_common.sh@930 -- # kill -0 2255598 00:06:27.403 16:14:56 -- common/autotest_common.sh@931 -- # uname 00:06:27.403 16:14:56 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:27.403 16:14:56 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2255598 00:06:27.403 16:14:56 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:27.403 16:14:56 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:27.403 16:14:56 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2255598' 00:06:27.403 killing process with pid 2255598 00:06:27.403 16:14:56 -- common/autotest_common.sh@945 -- # kill 2255598 00:06:27.403 16:14:56 -- common/autotest_common.sh@950 -- # wait 2255598 00:06:27.661 00:06:27.661 real 0m2.555s 00:06:27.661 user 0m2.757s 00:06:27.661 sys 0m0.796s 00:06:27.661 16:14:56 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:27.661 16:14:56 -- common/autotest_common.sh@10 -- # set +x 00:06:27.661 ************************************ 00:06:27.661 END TEST locking_app_on_locked_coremask 00:06:27.661 
************************************ 00:06:27.919 16:14:56 -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:06:27.919 16:14:56 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:27.919 16:14:56 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:27.919 16:14:56 -- common/autotest_common.sh@10 -- # set +x 00:06:27.919 ************************************ 00:06:27.919 START TEST locking_overlapped_coremask 00:06:27.919 ************************************ 00:06:27.919 16:14:56 -- common/autotest_common.sh@1104 -- # locking_overlapped_coremask 00:06:27.919 16:14:56 -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=2256170 00:06:27.919 16:14:56 -- event/cpu_locks.sh@133 -- # waitforlisten 2256170 /var/tmp/spdk.sock 00:06:27.919 16:14:56 -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:06:27.919 16:14:56 -- common/autotest_common.sh@819 -- # '[' -z 2256170 ']' 00:06:27.919 16:14:56 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:27.919 16:14:56 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:27.919 16:14:56 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:27.919 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:27.919 16:14:56 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:27.919 16:14:56 -- common/autotest_common.sh@10 -- # set +x 00:06:27.919 [2024-07-20 16:14:56.502869] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:27.919 [2024-07-20 16:14:56.502960] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2256170 ] 00:06:27.919 EAL: No free 2048 kB hugepages reported on node 1 00:06:27.919 [2024-07-20 16:14:56.570908] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:27.919 [2024-07-20 16:14:56.609047] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:27.919 [2024-07-20 16:14:56.609182] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:27.919 [2024-07-20 16:14:56.609300] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:27.919 [2024-07-20 16:14:56.609301] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:28.854 16:14:57 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:28.854 16:14:57 -- common/autotest_common.sh@852 -- # return 0 00:06:28.854 16:14:57 -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=2256220 00:06:28.854 16:14:57 -- event/cpu_locks.sh@137 -- # NOT waitforlisten 2256220 /var/tmp/spdk2.sock 00:06:28.854 16:14:57 -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:06:28.854 16:14:57 -- common/autotest_common.sh@640 -- # local es=0 00:06:28.854 16:14:57 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 2256220 /var/tmp/spdk2.sock 00:06:28.854 16:14:57 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:06:28.854 16:14:57 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:28.854 16:14:57 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:06:28.854 16:14:57 -- common/autotest_common.sh@632 
-- # case "$(type -t "$arg")" in 00:06:28.854 16:14:57 -- common/autotest_common.sh@643 -- # waitforlisten 2256220 /var/tmp/spdk2.sock 00:06:28.854 16:14:57 -- common/autotest_common.sh@819 -- # '[' -z 2256220 ']' 00:06:28.854 16:14:57 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:28.854 16:14:57 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:28.854 16:14:57 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:28.854 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:28.854 16:14:57 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:28.854 16:14:57 -- common/autotest_common.sh@10 -- # set +x 00:06:28.854 [2024-07-20 16:14:57.350490] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:28.854 [2024-07-20 16:14:57.350580] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2256220 ] 00:06:28.854 EAL: No free 2048 kB hugepages reported on node 1 00:06:28.854 [2024-07-20 16:14:57.461028] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 2256170 has claimed it. 00:06:28.854 [2024-07-20 16:14:57.461069] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:06:29.421 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (2256220) - No such process 00:06:29.421 ERROR: process (pid: 2256220) is no longer running 00:06:29.421 16:14:57 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:29.421 16:14:57 -- common/autotest_common.sh@852 -- # return 1 00:06:29.421 16:14:57 -- common/autotest_common.sh@643 -- # es=1 00:06:29.421 16:14:57 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:29.421 16:14:57 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:29.421 16:14:57 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:29.421 16:14:57 -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:06:29.421 16:14:57 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:06:29.421 16:14:57 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:06:29.421 16:14:57 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:06:29.421 16:14:57 -- event/cpu_locks.sh@141 -- # killprocess 2256170 00:06:29.421 16:14:57 -- common/autotest_common.sh@926 -- # '[' -z 2256170 ']' 00:06:29.421 16:14:57 -- common/autotest_common.sh@930 -- # kill -0 2256170 00:06:29.421 16:14:58 -- common/autotest_common.sh@931 -- # uname 00:06:29.421 16:14:58 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:29.421 16:14:58 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2256170 00:06:29.421 16:14:58 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:29.421 16:14:58 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:29.421 16:14:58 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2256170' 00:06:29.421 killing process with pid 2256170 00:06:29.421 16:14:58 -- 
common/autotest_common.sh@945 -- # kill 2256170 00:06:29.421 16:14:58 -- common/autotest_common.sh@950 -- # wait 2256170 00:06:29.680 00:06:29.680 real 0m1.873s 00:06:29.680 user 0m5.360s 00:06:29.680 sys 0m0.465s 00:06:29.680 16:14:58 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:29.680 16:14:58 -- common/autotest_common.sh@10 -- # set +x 00:06:29.680 ************************************ 00:06:29.680 END TEST locking_overlapped_coremask 00:06:29.680 ************************************ 00:06:29.680 16:14:58 -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:06:29.680 16:14:58 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:29.680 16:14:58 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:29.680 16:14:58 -- common/autotest_common.sh@10 -- # set +x 00:06:29.680 ************************************ 00:06:29.680 START TEST locking_overlapped_coremask_via_rpc 00:06:29.680 ************************************ 00:06:29.680 16:14:58 -- common/autotest_common.sh@1104 -- # locking_overlapped_coremask_via_rpc 00:06:29.680 16:14:58 -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=2256478 00:06:29.680 16:14:58 -- event/cpu_locks.sh@149 -- # waitforlisten 2256478 /var/tmp/spdk.sock 00:06:29.680 16:14:58 -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:06:29.680 16:14:58 -- common/autotest_common.sh@819 -- # '[' -z 2256478 ']' 00:06:29.680 16:14:58 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:29.680 16:14:58 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:29.680 16:14:58 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:29.680 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:29.680 16:14:58 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:29.680 16:14:58 -- common/autotest_common.sh@10 -- # set +x 00:06:29.680 [2024-07-20 16:14:58.424678] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:29.680 [2024-07-20 16:14:58.424773] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2256478 ] 00:06:29.680 EAL: No free 2048 kB hugepages reported on node 1 00:06:29.939 [2024-07-20 16:14:58.492537] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
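[editor's note] Both overlapped-coremask tests hinge on the same arithmetic: -m 0x7 pins cores 0-2 and -m 0x1c pins cores 2-4, so the masks collide exactly on core 2, the core named in the claim failures. A quick check:

    printf 'overlap: 0x%x\n' $(( 0x7 & 0x1c ))   # -> 0x4, bit 2 set: core 2 is shared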
00:06:29.939 [2024-07-20 16:14:58.492565] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:29.939 [2024-07-20 16:14:58.526658] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:29.939 [2024-07-20 16:14:58.526860] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:29.939 [2024-07-20 16:14:58.526956] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:29.939 [2024-07-20 16:14:58.526957] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:30.507 16:14:59 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:30.507 16:14:59 -- common/autotest_common.sh@852 -- # return 0 00:06:30.507 16:14:59 -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=2256707 00:06:30.507 16:14:59 -- event/cpu_locks.sh@153 -- # waitforlisten 2256707 /var/tmp/spdk2.sock 00:06:30.507 16:14:59 -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:06:30.507 16:14:59 -- common/autotest_common.sh@819 -- # '[' -z 2256707 ']' 00:06:30.507 16:14:59 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:30.507 16:14:59 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:30.507 16:14:59 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:30.507 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:30.507 16:14:59 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:30.507 16:14:59 -- common/autotest_common.sh@10 -- # set +x 00:06:30.507 [2024-07-20 16:14:59.252779] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:30.507 [2024-07-20 16:14:59.252867] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2256707 ] 00:06:30.507 EAL: No free 2048 kB hugepages reported on node 1 00:06:30.766 [2024-07-20 16:14:59.346431] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:06:30.766 [2024-07-20 16:14:59.350466] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:30.766 [2024-07-20 16:14:59.420493] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:30.766 [2024-07-20 16:14:59.420702] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:30.766 [2024-07-20 16:14:59.424496] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:30.766 [2024-07-20 16:14:59.424497] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 4 00:06:31.333 16:15:00 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:31.333 16:15:00 -- common/autotest_common.sh@852 -- # return 0 00:06:31.333 16:15:00 -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:06:31.333 16:15:00 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:31.333 16:15:00 -- common/autotest_common.sh@10 -- # set +x 00:06:31.333 16:15:00 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:31.333 16:15:00 -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:31.333 16:15:00 -- common/autotest_common.sh@640 -- # local es=0 00:06:31.333 16:15:00 -- common/autotest_common.sh@642 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:31.333 16:15:00 -- common/autotest_common.sh@628 -- # local arg=rpc_cmd 00:06:31.333 16:15:00 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:31.333 16:15:00 -- common/autotest_common.sh@632 -- # type -t rpc_cmd 00:06:31.333 16:15:00 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:31.333 16:15:00 -- common/autotest_common.sh@643 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:31.333 16:15:00 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:31.333 16:15:00 -- common/autotest_common.sh@10 -- # set +x 00:06:31.333 [2024-07-20 16:15:00.076505] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 2256478 has claimed it. 00:06:31.333 request: 00:06:31.333 { 00:06:31.333 "method": "framework_enable_cpumask_locks", 00:06:31.333 "req_id": 1 00:06:31.333 } 00:06:31.333 Got JSON-RPC error response 00:06:31.333 response: 00:06:31.333 { 00:06:31.333 "code": -32603, 00:06:31.333 "message": "Failed to claim CPU core: 2" 00:06:31.333 } 00:06:31.333 16:15:00 -- common/autotest_common.sh@579 -- # [[ 1 == 0 ]] 00:06:31.333 16:15:00 -- common/autotest_common.sh@643 -- # es=1 00:06:31.333 16:15:00 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:31.333 16:15:00 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:31.333 16:15:00 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:31.333 16:15:00 -- event/cpu_locks.sh@158 -- # waitforlisten 2256478 /var/tmp/spdk.sock 00:06:31.333 16:15:00 -- common/autotest_common.sh@819 -- # '[' -z 2256478 ']' 00:06:31.333 16:15:00 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:31.333 16:15:00 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:31.333 16:15:00 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:31.333 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
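[editor's note] The via_rpc variant defers locking to runtime: both targets start with --disable-cpumask-locks, the first claims its mask over RPC, and the second's claim then fails on the shared core with the JSON-RPC error shown above. Reconstructed exchange, with socket paths from the log:

    rpc=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py
    "$rpc" -s /var/tmp/spdk.sock framework_enable_cpumask_locks    # claims cores 0-2 (0x7)
    "$rpc" -s /var/tmp/spdk2.sock framework_enable_cpumask_locks   # core 2 already claimed:
    # error response: {"code": -32603, "message": "Failed to claim CPU core: 2"}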
00:06:31.333 16:15:00 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:31.333 16:15:00 -- common/autotest_common.sh@10 -- # set +x 00:06:31.590 16:15:00 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:31.590 16:15:00 -- common/autotest_common.sh@852 -- # return 0 00:06:31.590 16:15:00 -- event/cpu_locks.sh@159 -- # waitforlisten 2256707 /var/tmp/spdk2.sock 00:06:31.590 16:15:00 -- common/autotest_common.sh@819 -- # '[' -z 2256707 ']' 00:06:31.590 16:15:00 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:31.590 16:15:00 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:31.590 16:15:00 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:31.590 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:31.590 16:15:00 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:31.590 16:15:00 -- common/autotest_common.sh@10 -- # set +x 00:06:31.848 16:15:00 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:31.848 16:15:00 -- common/autotest_common.sh@852 -- # return 0 00:06:31.848 16:15:00 -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:06:31.848 16:15:00 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:06:31.848 16:15:00 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:06:31.848 16:15:00 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:06:31.848 00:06:31.848 real 0m2.043s 00:06:31.848 user 0m0.780s 00:06:31.848 sys 0m0.202s 00:06:31.848 16:15:00 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:31.848 16:15:00 -- common/autotest_common.sh@10 -- # set +x 00:06:31.848 ************************************ 00:06:31.848 END TEST locking_overlapped_coremask_via_rpc 00:06:31.848 ************************************ 00:06:31.848 16:15:00 -- event/cpu_locks.sh@174 -- # cleanup 00:06:31.848 16:15:00 -- event/cpu_locks.sh@15 -- # [[ -z 2256478 ]] 00:06:31.848 16:15:00 -- event/cpu_locks.sh@15 -- # killprocess 2256478 00:06:31.848 16:15:00 -- common/autotest_common.sh@926 -- # '[' -z 2256478 ']' 00:06:31.848 16:15:00 -- common/autotest_common.sh@930 -- # kill -0 2256478 00:06:31.848 16:15:00 -- common/autotest_common.sh@931 -- # uname 00:06:31.848 16:15:00 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:31.848 16:15:00 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2256478 00:06:31.848 16:15:00 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:31.848 16:15:00 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:31.848 16:15:00 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2256478' 00:06:31.848 killing process with pid 2256478 00:06:31.848 16:15:00 -- common/autotest_common.sh@945 -- # kill 2256478 00:06:31.848 16:15:00 -- common/autotest_common.sh@950 -- # wait 2256478 00:06:32.107 16:15:00 -- event/cpu_locks.sh@16 -- # [[ -z 2256707 ]] 00:06:32.107 16:15:00 -- event/cpu_locks.sh@16 -- # killprocess 2256707 00:06:32.107 16:15:00 -- common/autotest_common.sh@926 -- # '[' -z 2256707 ']' 00:06:32.107 16:15:00 -- common/autotest_common.sh@930 -- # kill -0 2256707 00:06:32.107 16:15:00 -- common/autotest_common.sh@931 -- # uname 
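[editor's note] check_remaining_locks, traced twice above, asserts that after the 3-core runs exactly spdk_cpu_lock_000 through _002 exist under /var/tmp. The traced glob-versus-brace comparison as a standalone helper:

    check_remaining_locks() {
        local locks=(/var/tmp/spdk_cpu_lock_*)
        local locks_expected=(/var/tmp/spdk_cpu_lock_{000..002})
        [[ ${locks[*]} == "${locks_expected[*]}" ]]
    }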
00:06:32.107 16:15:00 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:32.107 16:15:00 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2256707 00:06:32.107 16:15:00 -- common/autotest_common.sh@932 -- # process_name=reactor_2 00:06:32.107 16:15:00 -- common/autotest_common.sh@936 -- # '[' reactor_2 = sudo ']' 00:06:32.107 16:15:00 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2256707' 00:06:32.107 killing process with pid 2256707 00:06:32.107 16:15:00 -- common/autotest_common.sh@945 -- # kill 2256707 00:06:32.107 16:15:00 -- common/autotest_common.sh@950 -- # wait 2256707 00:06:32.672 16:15:01 -- event/cpu_locks.sh@18 -- # rm -f 00:06:32.672 16:15:01 -- event/cpu_locks.sh@1 -- # cleanup 00:06:32.672 16:15:01 -- event/cpu_locks.sh@15 -- # [[ -z 2256478 ]] 00:06:32.672 16:15:01 -- event/cpu_locks.sh@15 -- # killprocess 2256478 00:06:32.672 16:15:01 -- common/autotest_common.sh@926 -- # '[' -z 2256478 ']' 00:06:32.672 16:15:01 -- common/autotest_common.sh@930 -- # kill -0 2256478 00:06:32.672 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 930: kill: (2256478) - No such process 00:06:32.672 16:15:01 -- common/autotest_common.sh@953 -- # echo 'Process with pid 2256478 is not found' 00:06:32.672 Process with pid 2256478 is not found 00:06:32.672 16:15:01 -- event/cpu_locks.sh@16 -- # [[ -z 2256707 ]] 00:06:32.672 16:15:01 -- event/cpu_locks.sh@16 -- # killprocess 2256707 00:06:32.672 16:15:01 -- common/autotest_common.sh@926 -- # '[' -z 2256707 ']' 00:06:32.672 16:15:01 -- common/autotest_common.sh@930 -- # kill -0 2256707 00:06:32.672 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 930: kill: (2256707) - No such process 00:06:32.672 16:15:01 -- common/autotest_common.sh@953 -- # echo 'Process with pid 2256707 is not found' 00:06:32.672 Process with pid 2256707 is not found 00:06:32.672 16:15:01 -- event/cpu_locks.sh@18 -- # rm -f 00:06:32.672 00:06:32.672 real 0m18.128s 00:06:32.672 user 0m30.427s 00:06:32.672 sys 0m5.964s 00:06:32.672 16:15:01 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:32.672 16:15:01 -- common/autotest_common.sh@10 -- # set +x 00:06:32.672 ************************************ 00:06:32.672 END TEST cpu_locks 00:06:32.672 ************************************ 00:06:32.672 00:06:32.673 real 0m42.563s 00:06:32.673 user 1m18.787s 00:06:32.673 sys 0m10.034s 00:06:32.673 16:15:01 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:32.673 16:15:01 -- common/autotest_common.sh@10 -- # set +x 00:06:32.673 ************************************ 00:06:32.673 END TEST event 00:06:32.673 ************************************ 00:06:32.673 16:15:01 -- spdk/autotest.sh@188 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:06:32.673 16:15:01 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:32.673 16:15:01 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:32.673 16:15:01 -- common/autotest_common.sh@10 -- # set +x 00:06:32.673 ************************************ 00:06:32.673 START TEST thread 00:06:32.673 ************************************ 00:06:32.673 16:15:01 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:06:32.673 * Looking for test storage... 
00:06:32.673 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:06:32.673 16:15:01 -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:32.673 16:15:01 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']' 00:06:32.673 16:15:01 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:32.673 16:15:01 -- common/autotest_common.sh@10 -- # set +x 00:06:32.673 ************************************ 00:06:32.673 START TEST thread_poller_perf 00:06:32.673 ************************************ 00:06:32.673 16:15:01 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:32.673 [2024-07-20 16:15:01.392223] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:32.673 [2024-07-20 16:15:01.392334] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2257213 ] 00:06:32.673 EAL: No free 2048 kB hugepages reported on node 1 00:06:32.673 [2024-07-20 16:15:01.463990] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:32.930 [2024-07-20 16:15:01.501156] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:32.930 Running 1000 pollers for 1 seconds with 1 microseconds period. 00:06:33.863 ====================================== 00:06:33.863 busy:2504264252 (cyc) 00:06:33.863 total_run_count: 800000 00:06:33.863 tsc_hz: 2500000000 (cyc) 00:06:33.863 ====================================== 00:06:33.863 poller_cost: 3130 (cyc), 1252 (nsec) 00:06:33.863 00:06:33.863 real 0m1.184s 00:06:33.863 user 0m1.081s 00:06:33.863 sys 0m0.097s 00:06:33.863 16:15:02 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:33.863 16:15:02 -- common/autotest_common.sh@10 -- # set +x 00:06:33.863 ************************************ 00:06:33.863 END TEST thread_poller_perf 00:06:33.863 ************************************ 00:06:33.863 16:15:02 -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:33.863 16:15:02 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']' 00:06:33.863 16:15:02 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:33.863 16:15:02 -- common/autotest_common.sh@10 -- # set +x 00:06:33.863 ************************************ 00:06:33.863 START TEST thread_poller_perf 00:06:33.863 ************************************ 00:06:33.863 16:15:02 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:33.863 [2024-07-20 16:15:02.604810] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
00:06:33.863 [2024-07-20 16:15:02.604874] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2257525 ] 00:06:33.863 EAL: No free 2048 kB hugepages reported on node 1 00:06:34.121 [2024-07-20 16:15:02.668620] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:34.121 [2024-07-20 16:15:02.704052] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:34.121 Running 1000 pollers for 1 seconds with 0 microseconds period. 00:06:35.052 ====================================== 00:06:35.052 busy:2502100238 (cyc) 00:06:35.052 total_run_count: 13527000 00:06:35.052 tsc_hz: 2500000000 (cyc) 00:06:35.052 ====================================== 00:06:35.052 poller_cost: 184 (cyc), 73 (nsec) 00:06:35.052 00:06:35.052 real 0m1.162s 00:06:35.052 user 0m1.083s 00:06:35.052 sys 0m0.075s 00:06:35.052 16:15:03 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:35.052 16:15:03 -- common/autotest_common.sh@10 -- # set +x 00:06:35.052 ************************************ 00:06:35.052 END TEST thread_poller_perf 00:06:35.052 ************************************ 00:06:35.052 16:15:03 -- thread/thread.sh@17 -- # [[ n != \y ]] 00:06:35.052 16:15:03 -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:06:35.052 16:15:03 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:35.052 16:15:03 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:35.052 16:15:03 -- common/autotest_common.sh@10 -- # set +x 00:06:35.052 ************************************ 00:06:35.052 START TEST thread_spdk_lock 00:06:35.052 ************************************ 00:06:35.052 16:15:03 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:06:35.052 [2024-07-20 16:15:03.818138] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
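Note: the poller_cost figures in the two runs above follow directly from the printed counters: cost in cycles is busy cycles divided by total_run_count, and the nanosecond figure is that divided by tsc_hz (2.5 GHz here). A quick check, assuming the tool truncates the same way shell integer arithmetic does:

    echo $(( 2504264252 / 800000 ))              # 3130 cyc per poll, 1 us timed poller
    echo $(( 3130 * 1000000000 / 2500000000 ))   # 1252 nsec
    echo $(( 2502100238 / 13527000 ))            # 184 cyc per poll, 0 us run-always poller
    echo $(( 184 * 1000000000 / 2500000000 ))    # 73 nsec

The ~17x gap between 3130 and 184 cycles suggests most of the timed poller's cost is timer bookkeeping per dispatch, versus simply re-running an active poller on every reactor iteration.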
00:06:35.052 [2024-07-20 16:15:03.818240] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2257748 ] 00:06:35.052 EAL: No free 2048 kB hugepages reported on node 1 00:06:35.309 [2024-07-20 16:15:03.888748] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:35.309 [2024-07-20 16:15:03.925648] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:35.309 [2024-07-20 16:15:03.925651] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:35.874 [2024-07-20 16:15:04.415969] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 955:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:35.874 [2024-07-20 16:15:04.416014] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3062:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:06:35.874 [2024-07-20 16:15:04.416027] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3017:sspin_stacks_print: *ERROR*: spinlock 0x12f6280 00:06:35.874 [2024-07-20 16:15:04.416810] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 850:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:35.874 [2024-07-20 16:15:04.416915] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1016:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:35.874 [2024-07-20 16:15:04.416935] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 850:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:35.874 Starting test contend 00:06:35.874 Worker Delay Wait us Hold us Total us 00:06:35.874 0 3 178477 186009 364487 00:06:35.874 1 5 90520 287185 377705 00:06:35.874 PASS test contend 00:06:35.874 Starting test hold_by_poller 00:06:35.874 PASS test hold_by_poller 00:06:35.874 Starting test hold_by_message 00:06:35.874 PASS test hold_by_message 00:06:35.874 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:06:35.874 100014 assertions passed 00:06:35.874 0 assertions failed 00:06:35.874 00:06:35.874 real 0m0.666s 00:06:35.874 user 0m1.067s 00:06:35.874 sys 0m0.087s 00:06:35.874 16:15:04 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:35.874 16:15:04 -- common/autotest_common.sh@10 -- # set +x 00:06:35.874 ************************************ 00:06:35.874 END TEST thread_spdk_lock 00:06:35.874 ************************************ 00:06:35.874 00:06:35.874 real 0m3.223s 00:06:35.874 user 0m3.315s 00:06:35.874 sys 0m0.416s 00:06:35.874 16:15:04 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:35.874 16:15:04 -- common/autotest_common.sh@10 -- # set +x 00:06:35.874 ************************************ 00:06:35.874 END TEST thread 00:06:35.874 ************************************ 00:06:35.874 16:15:04 -- spdk/autotest.sh@189 -- # run_test accel /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:06:35.874 16:15:04 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 
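Note: in the contend table above, each worker's Total column is Wait us + Hold us; worker 0 is one microsecond off, consistent with per-column rounding. The sums check out directly:

    echo $(( 178477 + 186009 ))   # 364486 vs 364487 reported for worker 0
    echo $(( 90520 + 287185 ))    # 377705, exact match for worker 1

The unrecoverable-spinlock *ERROR* traces above are expected output of this test's deliberately triggered error paths, not real failures; the run finishes with 100014 assertions passed and 0 failed.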
00:06:35.874 16:15:04 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:35.874 16:15:04 -- common/autotest_common.sh@10 -- # set +x 00:06:35.874 ************************************ 00:06:35.874 START TEST accel 00:06:35.874 ************************************ 00:06:35.874 16:15:04 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:06:35.874 * Looking for test storage... 00:06:35.874 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:06:35.874 16:15:04 -- accel/accel.sh@73 -- # declare -A expected_opcs 00:06:35.874 16:15:04 -- accel/accel.sh@74 -- # get_expected_opcs 00:06:35.874 16:15:04 -- accel/accel.sh@57 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:35.874 16:15:04 -- accel/accel.sh@59 -- # spdk_tgt_pid=2258073 00:06:35.874 16:15:04 -- accel/accel.sh@60 -- # waitforlisten 2258073 00:06:35.874 16:15:04 -- common/autotest_common.sh@819 -- # '[' -z 2258073 ']' 00:06:35.874 16:15:04 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:35.874 16:15:04 -- accel/accel.sh@58 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:06:35.874 16:15:04 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:35.874 16:15:04 -- accel/accel.sh@58 -- # build_accel_config 00:06:35.874 16:15:04 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:35.874 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:35.874 16:15:04 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:35.874 16:15:04 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:35.874 16:15:04 -- common/autotest_common.sh@10 -- # set +x 00:06:35.874 16:15:04 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:35.874 16:15:04 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:35.874 16:15:04 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:35.874 16:15:04 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:35.874 16:15:04 -- accel/accel.sh@41 -- # local IFS=, 00:06:35.874 16:15:04 -- accel/accel.sh@42 -- # jq -r . 00:06:35.874 [2024-07-20 16:15:04.675291] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:35.874 [2024-07-20 16:15:04.675379] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2258073 ] 00:06:36.132 EAL: No free 2048 kB hugepages reported on node 1 00:06:36.132 [2024-07-20 16:15:04.743299] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:36.132 [2024-07-20 16:15:04.780314] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:36.132 [2024-07-20 16:15:04.780430] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:36.699 16:15:05 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:36.699 16:15:05 -- common/autotest_common.sh@852 -- # return 0 00:06:36.699 16:15:05 -- accel/accel.sh@62 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". 
| to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:06:36.699 16:15:05 -- accel/accel.sh@62 -- # rpc_cmd accel_get_opc_assignments 00:06:36.699 16:15:05 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:36.699 16:15:05 -- common/autotest_common.sh@10 -- # set +x 00:06:36.699 16:15:05 -- accel/accel.sh@62 -- # jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]' 00:06:36.958 16:15:05 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:36.958 16:15:05 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:36.958 16:15:05 -- accel/accel.sh@64 -- # IFS== 00:06:36.958 16:15:05 -- accel/accel.sh@64 -- # read -r opc module 00:06:36.958 16:15:05 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:36.958 16:15:05 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:36.958 16:15:05 -- accel/accel.sh@64 -- # IFS== 00:06:36.958 16:15:05 -- accel/accel.sh@64 -- # read -r opc module 00:06:36.958 16:15:05 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:36.958 16:15:05 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:36.958 16:15:05 -- accel/accel.sh@64 -- # IFS== 00:06:36.958 16:15:05 -- accel/accel.sh@64 -- # read -r opc module 00:06:36.958 16:15:05 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:36.958 16:15:05 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:36.958 16:15:05 -- accel/accel.sh@64 -- # IFS== 00:06:36.958 16:15:05 -- accel/accel.sh@64 -- # read -r opc module 00:06:36.958 16:15:05 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:36.958 16:15:05 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:36.958 16:15:05 -- accel/accel.sh@64 -- # IFS== 00:06:36.958 16:15:05 -- accel/accel.sh@64 -- # read -r opc module 00:06:36.958 16:15:05 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:36.958 16:15:05 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:36.958 16:15:05 -- accel/accel.sh@64 -- # IFS== 00:06:36.958 16:15:05 -- accel/accel.sh@64 -- # read -r opc module 00:06:36.958 16:15:05 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:36.958 16:15:05 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:36.958 16:15:05 -- accel/accel.sh@64 -- # IFS== 00:06:36.958 16:15:05 -- accel/accel.sh@64 -- # read -r opc module 00:06:36.958 16:15:05 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:36.958 16:15:05 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:36.958 16:15:05 -- accel/accel.sh@64 -- # IFS== 00:06:36.958 16:15:05 -- accel/accel.sh@64 -- # read -r opc module 00:06:36.958 16:15:05 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:36.958 16:15:05 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:36.958 16:15:05 -- accel/accel.sh@64 -- # IFS== 00:06:36.958 16:15:05 -- accel/accel.sh@64 -- # read -r opc module 00:06:36.958 16:15:05 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:36.958 16:15:05 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:36.958 16:15:05 -- accel/accel.sh@64 -- # IFS== 00:06:36.958 16:15:05 -- accel/accel.sh@64 -- # read -r opc module 00:06:36.958 16:15:05 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:36.958 16:15:05 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:36.958 16:15:05 -- accel/accel.sh@64 -- # IFS== 00:06:36.958 16:15:05 -- accel/accel.sh@64 -- # read -r opc module 00:06:36.958 16:15:05 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:36.958 16:15:05 
-- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:36.958 16:15:05 -- accel/accel.sh@64 -- # IFS== 00:06:36.958 16:15:05 -- accel/accel.sh@64 -- # read -r opc module 00:06:36.958 16:15:05 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:36.958 16:15:05 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:36.958 16:15:05 -- accel/accel.sh@64 -- # IFS== 00:06:36.958 16:15:05 -- accel/accel.sh@64 -- # read -r opc module 00:06:36.958 16:15:05 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:36.958 16:15:05 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:36.958 16:15:05 -- accel/accel.sh@64 -- # IFS== 00:06:36.958 16:15:05 -- accel/accel.sh@64 -- # read -r opc module 00:06:36.958 16:15:05 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:36.958 16:15:05 -- accel/accel.sh@67 -- # killprocess 2258073 00:06:36.958 16:15:05 -- common/autotest_common.sh@926 -- # '[' -z 2258073 ']' 00:06:36.958 16:15:05 -- common/autotest_common.sh@930 -- # kill -0 2258073 00:06:36.958 16:15:05 -- common/autotest_common.sh@931 -- # uname 00:06:36.958 16:15:05 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:36.958 16:15:05 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2258073 00:06:36.958 16:15:05 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:36.958 16:15:05 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:36.958 16:15:05 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2258073' 00:06:36.958 killing process with pid 2258073 00:06:36.958 16:15:05 -- common/autotest_common.sh@945 -- # kill 2258073 00:06:36.958 16:15:05 -- common/autotest_common.sh@950 -- # wait 2258073 00:06:37.215 16:15:05 -- accel/accel.sh@68 -- # trap - ERR 00:06:37.215 16:15:05 -- accel/accel.sh@81 -- # run_test accel_help accel_perf -h 00:06:37.215 16:15:05 -- common/autotest_common.sh@1077 -- # '[' 3 -le 1 ']' 00:06:37.215 16:15:05 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:37.215 16:15:05 -- common/autotest_common.sh@10 -- # set +x 00:06:37.215 16:15:05 -- common/autotest_common.sh@1104 -- # accel_perf -h 00:06:37.215 16:15:05 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:06:37.215 16:15:05 -- accel/accel.sh@12 -- # build_accel_config 00:06:37.215 16:15:05 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:37.215 16:15:05 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:37.215 16:15:05 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:37.215 16:15:05 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:37.215 16:15:05 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:37.215 16:15:05 -- accel/accel.sh@41 -- # local IFS=, 00:06:37.215 16:15:05 -- accel/accel.sh@42 -- # jq -r . 
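Note: the long expected_opcs loop above is just walking the opcode-to-module table returned by the accel_get_opc_assignments RPC and recording that every opcode is currently backed by the software module. The same table can be dumped directly; the scripts/rpc.py location is assumed, and the jq filter is the one the test itself uses:

    ./scripts/rpc.py accel_get_opc_assignments \
      | jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]'   # e.g. copy=software, crc32c=software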
00:06:37.215 16:15:05 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:37.215 16:15:05 -- common/autotest_common.sh@10 -- # set +x 00:06:37.215 16:15:05 -- accel/accel.sh@83 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:06:37.215 16:15:05 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:37.215 16:15:05 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:37.215 16:15:05 -- common/autotest_common.sh@10 -- # set +x 00:06:37.215 ************************************ 00:06:37.215 START TEST accel_missing_filename 00:06:37.215 ************************************ 00:06:37.215 16:15:05 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w compress 00:06:37.215 16:15:05 -- common/autotest_common.sh@640 -- # local es=0 00:06:37.215 16:15:05 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w compress 00:06:37.215 16:15:05 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:06:37.215 16:15:05 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:37.215 16:15:05 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:06:37.215 16:15:05 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:37.216 16:15:05 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w compress 00:06:37.216 16:15:05 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:06:37.216 16:15:05 -- accel/accel.sh@12 -- # build_accel_config 00:06:37.216 16:15:05 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:37.216 16:15:05 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:37.216 16:15:05 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:37.216 16:15:05 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:37.216 16:15:05 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:37.216 16:15:05 -- accel/accel.sh@41 -- # local IFS=, 00:06:37.216 16:15:05 -- accel/accel.sh@42 -- # jq -r . 00:06:37.216 [2024-07-20 16:15:05.970674] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:37.216 [2024-07-20 16:15:05.970766] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2258613 ] 00:06:37.216 EAL: No free 2048 kB hugepages reported on node 1 00:06:37.473 [2024-07-20 16:15:06.039702] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:37.473 [2024-07-20 16:15:06.075332] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:37.473 [2024-07-20 16:15:06.113966] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:37.473 [2024-07-20 16:15:06.173479] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:06:37.473 A filename is required. 
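Note: accel_missing_filename is a negative test: running the compress workload with no -l input file must make accel_perf exit nonzero, and the es bookkeeping in the trace that follows normalizes that status before the NOT wrapper inverts it. A simplified sketch of the normalization seen below, with 234 taken from this run (statuses above 128 encode death by signal):

    es=234                                  # raw status from the failed accel_perf run
    (( es > 128 )) && es=$(( es - 128 ))    # 234 -> 106
    (( es != 0 )) && es=1                   # collapse any remaining failure to 1
    echo "es=$es"                           # nonzero, so NOT reports the test as passed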
00:06:37.473 16:15:06 -- common/autotest_common.sh@643 -- # es=234 00:06:37.473 16:15:06 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:37.473 16:15:06 -- common/autotest_common.sh@652 -- # es=106 00:06:37.474 16:15:06 -- common/autotest_common.sh@653 -- # case "$es" in 00:06:37.474 16:15:06 -- common/autotest_common.sh@660 -- # es=1 00:06:37.474 16:15:06 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:37.474 00:06:37.474 real 0m0.283s 00:06:37.474 user 0m0.192s 00:06:37.474 sys 0m0.133s 00:06:37.474 16:15:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:37.474 16:15:06 -- common/autotest_common.sh@10 -- # set +x 00:06:37.474 ************************************ 00:06:37.474 END TEST accel_missing_filename 00:06:37.474 ************************************ 00:06:37.474 16:15:06 -- accel/accel.sh@85 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:37.474 16:15:06 -- common/autotest_common.sh@1077 -- # '[' 10 -le 1 ']' 00:06:37.474 16:15:06 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:37.474 16:15:06 -- common/autotest_common.sh@10 -- # set +x 00:06:37.732 ************************************ 00:06:37.732 START TEST accel_compress_verify 00:06:37.732 ************************************ 00:06:37.732 16:15:06 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:37.732 16:15:06 -- common/autotest_common.sh@640 -- # local es=0 00:06:37.732 16:15:06 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:37.732 16:15:06 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:06:37.732 16:15:06 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:37.732 16:15:06 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:06:37.732 16:15:06 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:37.732 16:15:06 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:37.732 16:15:06 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:37.732 16:15:06 -- accel/accel.sh@12 -- # build_accel_config 00:06:37.732 16:15:06 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:37.732 16:15:06 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:37.732 16:15:06 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:37.732 16:15:06 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:37.732 16:15:06 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:37.732 16:15:06 -- accel/accel.sh@41 -- # local IFS=, 00:06:37.732 16:15:06 -- accel/accel.sh@42 -- # jq -r . 00:06:37.732 [2024-07-20 16:15:06.299579] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
00:06:37.732 [2024-07-20 16:15:06.299702] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2258639 ] 00:06:37.732 EAL: No free 2048 kB hugepages reported on node 1 00:06:37.732 [2024-07-20 16:15:06.369258] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:37.732 [2024-07-20 16:15:06.404701] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:37.732 [2024-07-20 16:15:06.444312] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:37.732 [2024-07-20 16:15:06.503549] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:06:37.991 00:06:37.991 Compression does not support the verify option, aborting. 00:06:37.991 16:15:06 -- common/autotest_common.sh@643 -- # es=161 00:06:37.991 16:15:06 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:37.991 16:15:06 -- common/autotest_common.sh@652 -- # es=33 00:06:37.991 16:15:06 -- common/autotest_common.sh@653 -- # case "$es" in 00:06:37.991 16:15:06 -- common/autotest_common.sh@660 -- # es=1 00:06:37.991 16:15:06 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:37.991 00:06:37.991 real 0m0.285s 00:06:37.991 user 0m0.195s 00:06:37.991 sys 0m0.130s 00:06:37.991 16:15:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:37.991 16:15:06 -- common/autotest_common.sh@10 -- # set +x 00:06:37.991 ************************************ 00:06:37.991 END TEST accel_compress_verify 00:06:37.991 ************************************ 00:06:37.991 16:15:06 -- accel/accel.sh@87 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:06:37.991 16:15:06 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:37.991 16:15:06 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:37.991 16:15:06 -- common/autotest_common.sh@10 -- # set +x 00:06:37.991 ************************************ 00:06:37.991 START TEST accel_wrong_workload 00:06:37.991 ************************************ 00:06:37.991 16:15:06 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w foobar 00:06:37.991 16:15:06 -- common/autotest_common.sh@640 -- # local es=0 00:06:37.991 16:15:06 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:06:37.991 16:15:06 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:06:37.991 16:15:06 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:37.991 16:15:06 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:06:37.991 16:15:06 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:37.991 16:15:06 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w foobar 00:06:37.991 16:15:06 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:06:37.991 16:15:06 -- accel/accel.sh@12 -- # build_accel_config 00:06:37.991 16:15:06 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:37.991 16:15:06 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:37.991 16:15:06 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:37.991 16:15:06 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:37.991 16:15:06 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:37.991 16:15:06 -- accel/accel.sh@41 -- # local IFS=, 00:06:37.991 16:15:06 -- accel/accel.sh@42 -- # jq -r . 
00:06:37.991 Unsupported workload type: foobar 00:06:37.991 [2024-07-20 16:15:06.631487] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:06:37.991 accel_perf options: 00:06:37.991 [-h help message] 00:06:37.991 [-q queue depth per core] 00:06:37.991 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:37.991 [-T number of threads per core 00:06:37.991 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:37.991 [-t time in seconds] 00:06:37.991 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:37.991 [ dif_verify, , dif_generate, dif_generate_copy 00:06:37.991 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:37.991 [-l for compress/decompress workloads, name of uncompressed input file 00:06:37.991 [-S for crc32c workload, use this seed value (default 0) 00:06:37.991 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:37.991 [-f for fill workload, use this BYTE value (default 255) 00:06:37.991 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:37.991 [-y verify result if this switch is on] 00:06:37.991 [-a tasks to allocate per core (default: same value as -q)] 00:06:37.991 Can be used to spread operations across a wider range of memory. 00:06:37.991 16:15:06 -- common/autotest_common.sh@643 -- # es=1 00:06:37.991 16:15:06 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:37.991 16:15:06 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:37.991 16:15:06 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:37.991 00:06:37.991 real 0m0.027s 00:06:37.991 user 0m0.013s 00:06:37.991 sys 0m0.014s 00:06:37.991 16:15:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:37.991 16:15:06 -- common/autotest_common.sh@10 -- # set +x 00:06:37.991 ************************************ 00:06:37.991 END TEST accel_wrong_workload 00:06:37.991 ************************************ 00:06:37.991 Error: writing output failed: Broken pipe 00:06:37.991 16:15:06 -- accel/accel.sh@89 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:06:37.991 16:15:06 -- common/autotest_common.sh@1077 -- # '[' 10 -le 1 ']' 00:06:37.991 16:15:06 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:37.991 16:15:06 -- common/autotest_common.sh@10 -- # set +x 00:06:37.991 ************************************ 00:06:37.991 START TEST accel_negative_buffers 00:06:37.991 ************************************ 00:06:37.991 16:15:06 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:06:37.991 16:15:06 -- common/autotest_common.sh@640 -- # local es=0 00:06:37.991 16:15:06 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:06:37.991 16:15:06 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:06:37.992 16:15:06 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:37.992 16:15:06 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:06:37.992 16:15:06 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:37.992 16:15:06 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w xor -y -x -1 00:06:37.992 16:15:06 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w 
xor -y -x -1 00:06:37.992 16:15:06 -- accel/accel.sh@12 -- # build_accel_config 00:06:37.992 16:15:06 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:37.992 16:15:06 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:37.992 16:15:06 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:37.992 16:15:06 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:37.992 16:15:06 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:37.992 16:15:06 -- accel/accel.sh@41 -- # local IFS=, 00:06:37.992 16:15:06 -- accel/accel.sh@42 -- # jq -r . 00:06:37.992 -x option must be non-negative. 00:06:37.992 [2024-07-20 16:15:06.705249] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:06:37.992 accel_perf options: 00:06:37.992 [-h help message] 00:06:37.992 [-q queue depth per core] 00:06:37.992 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:37.992 [-T number of threads per core 00:06:37.992 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:37.992 [-t time in seconds] 00:06:37.992 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:37.992 [ dif_verify, , dif_generate, dif_generate_copy 00:06:37.992 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:37.992 [-l for compress/decompress workloads, name of uncompressed input file 00:06:37.992 [-S for crc32c workload, use this seed value (default 0) 00:06:37.992 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:37.992 [-f for fill workload, use this BYTE value (default 255) 00:06:37.992 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:37.992 [-y verify result if this switch is on] 00:06:37.992 [-a tasks to allocate per core (default: same value as -q)] 00:06:37.992 Can be used to spread operations across a wider range of memory. 
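Note: accel_wrong_workload and accel_negative_buffers follow the same pattern: feed accel_perf an argument it must reject (-w foobar, -x -1), expect a nonzero exit, and treat the usage text above as evidence that option parsing bailed out. The "Broken pipe" messages appear to be a side effect of the wrapper tearing down the capture pipe before the usage text finishes printing, not part of the checks. By hand, with a hypothetical build path matching this workspace:

    perf=./build/examples/accel_perf        # path assumed
    $perf -t 1 -w foobar;       echo "foobar workload -> exit $?"   # expect nonzero
    $perf -t 1 -w xor -y -x -1; echo "negative -x     -> exit $?"   # expect nonzero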
00:06:37.992 16:15:06 -- common/autotest_common.sh@643 -- # es=1 00:06:37.992 16:15:06 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:37.992 16:15:06 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:37.992 16:15:06 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:37.992 00:06:37.992 real 0m0.028s 00:06:37.992 user 0m0.009s 00:06:37.992 sys 0m0.019s 00:06:37.992 16:15:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:37.992 16:15:06 -- common/autotest_common.sh@10 -- # set +x 00:06:37.992 ************************************ 00:06:37.992 END TEST accel_negative_buffers 00:06:37.992 ************************************ 00:06:37.992 Error: writing output failed: Broken pipe 00:06:37.992 16:15:06 -- accel/accel.sh@93 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:06:37.992 16:15:06 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:06:37.992 16:15:06 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:37.992 16:15:06 -- common/autotest_common.sh@10 -- # set +x 00:06:37.992 ************************************ 00:06:37.992 START TEST accel_crc32c 00:06:37.992 ************************************ 00:06:37.992 16:15:06 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w crc32c -S 32 -y 00:06:37.992 16:15:06 -- accel/accel.sh@16 -- # local accel_opc 00:06:37.992 16:15:06 -- accel/accel.sh@17 -- # local accel_module 00:06:37.992 16:15:06 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:37.992 16:15:06 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:37.992 16:15:06 -- accel/accel.sh@12 -- # build_accel_config 00:06:37.992 16:15:06 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:37.992 16:15:06 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:37.992 16:15:06 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:37.992 16:15:06 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:37.992 16:15:06 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:37.992 16:15:06 -- accel/accel.sh@41 -- # local IFS=, 00:06:37.992 16:15:06 -- accel/accel.sh@42 -- # jq -r . 00:06:37.992 [2024-07-20 16:15:06.779005] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:37.992 [2024-07-20 16:15:06.779087] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2258828 ] 00:06:38.250 EAL: No free 2048 kB hugepages reported on node 1 00:06:38.250 [2024-07-20 16:15:06.849597] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:38.250 [2024-07-20 16:15:06.886141] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:39.626 16:15:08 -- accel/accel.sh@18 -- # out=' 00:06:39.626 SPDK Configuration: 00:06:39.626 Core mask: 0x1 00:06:39.626 00:06:39.626 Accel Perf Configuration: 00:06:39.626 Workload Type: crc32c 00:06:39.626 CRC-32C seed: 32 00:06:39.626 Transfer size: 4096 bytes 00:06:39.626 Vector count 1 00:06:39.626 Module: software 00:06:39.626 Queue depth: 32 00:06:39.626 Allocate depth: 32 00:06:39.626 # threads/core: 1 00:06:39.626 Run time: 1 seconds 00:06:39.626 Verify: Yes 00:06:39.626 00:06:39.626 Running for 1 seconds... 
00:06:39.626 00:06:39.626 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:39.626 ------------------------------------------------------------------------------------ 00:06:39.626 0,0 860960/s 3363 MiB/s 0 0 00:06:39.626 ==================================================================================== 00:06:39.626 Total 860960/s 3363 MiB/s 0 0' 00:06:39.626 16:15:08 -- accel/accel.sh@20 -- # IFS=: 00:06:39.626 16:15:08 -- accel/accel.sh@20 -- # read -r var val 00:06:39.626 16:15:08 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:39.626 16:15:08 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:39.626 16:15:08 -- accel/accel.sh@12 -- # build_accel_config 00:06:39.626 16:15:08 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:39.626 16:15:08 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:39.626 16:15:08 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:39.626 16:15:08 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:39.626 16:15:08 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:39.626 16:15:08 -- accel/accel.sh@41 -- # local IFS=, 00:06:39.626 16:15:08 -- accel/accel.sh@42 -- # jq -r . 00:06:39.626 [2024-07-20 16:15:08.069572] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:39.626 [2024-07-20 16:15:08.069666] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2258980 ] 00:06:39.626 EAL: No free 2048 kB hugepages reported on node 1 00:06:39.626 [2024-07-20 16:15:08.140641] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:39.626 [2024-07-20 16:15:08.175723] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:39.626 16:15:08 -- accel/accel.sh@21 -- # val= 00:06:39.626 16:15:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.626 16:15:08 -- accel/accel.sh@20 -- # IFS=: 00:06:39.626 16:15:08 -- accel/accel.sh@20 -- # read -r var val 00:06:39.626 16:15:08 -- accel/accel.sh@21 -- # val= 00:06:39.626 16:15:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.626 16:15:08 -- accel/accel.sh@20 -- # IFS=: 00:06:39.626 16:15:08 -- accel/accel.sh@20 -- # read -r var val 00:06:39.626 16:15:08 -- accel/accel.sh@21 -- # val=0x1 00:06:39.626 16:15:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.626 16:15:08 -- accel/accel.sh@20 -- # IFS=: 00:06:39.626 16:15:08 -- accel/accel.sh@20 -- # read -r var val 00:06:39.626 16:15:08 -- accel/accel.sh@21 -- # val= 00:06:39.626 16:15:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.626 16:15:08 -- accel/accel.sh@20 -- # IFS=: 00:06:39.626 16:15:08 -- accel/accel.sh@20 -- # read -r var val 00:06:39.626 16:15:08 -- accel/accel.sh@21 -- # val= 00:06:39.626 16:15:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.626 16:15:08 -- accel/accel.sh@20 -- # IFS=: 00:06:39.626 16:15:08 -- accel/accel.sh@20 -- # read -r var val 00:06:39.626 16:15:08 -- accel/accel.sh@21 -- # val=crc32c 00:06:39.626 16:15:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.626 16:15:08 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:06:39.626 16:15:08 -- accel/accel.sh@20 -- # IFS=: 00:06:39.626 16:15:08 -- accel/accel.sh@20 -- # read -r var val 00:06:39.626 16:15:08 -- accel/accel.sh@21 -- # val=32 00:06:39.626 16:15:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.626 16:15:08 -- accel/accel.sh@20 -- # IFS=: 00:06:39.626 
16:15:08 -- accel/accel.sh@20 -- # read -r var val 00:06:39.626 16:15:08 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:39.626 16:15:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.626 16:15:08 -- accel/accel.sh@20 -- # IFS=: 00:06:39.626 16:15:08 -- accel/accel.sh@20 -- # read -r var val 00:06:39.627 16:15:08 -- accel/accel.sh@21 -- # val= 00:06:39.627 16:15:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.627 16:15:08 -- accel/accel.sh@20 -- # IFS=: 00:06:39.627 16:15:08 -- accel/accel.sh@20 -- # read -r var val 00:06:39.627 16:15:08 -- accel/accel.sh@21 -- # val=software 00:06:39.627 16:15:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.627 16:15:08 -- accel/accel.sh@23 -- # accel_module=software 00:06:39.627 16:15:08 -- accel/accel.sh@20 -- # IFS=: 00:06:39.627 16:15:08 -- accel/accel.sh@20 -- # read -r var val 00:06:39.627 16:15:08 -- accel/accel.sh@21 -- # val=32 00:06:39.627 16:15:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.627 16:15:08 -- accel/accel.sh@20 -- # IFS=: 00:06:39.627 16:15:08 -- accel/accel.sh@20 -- # read -r var val 00:06:39.627 16:15:08 -- accel/accel.sh@21 -- # val=32 00:06:39.627 16:15:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.627 16:15:08 -- accel/accel.sh@20 -- # IFS=: 00:06:39.627 16:15:08 -- accel/accel.sh@20 -- # read -r var val 00:06:39.627 16:15:08 -- accel/accel.sh@21 -- # val=1 00:06:39.627 16:15:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.627 16:15:08 -- accel/accel.sh@20 -- # IFS=: 00:06:39.627 16:15:08 -- accel/accel.sh@20 -- # read -r var val 00:06:39.627 16:15:08 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:39.627 16:15:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.627 16:15:08 -- accel/accel.sh@20 -- # IFS=: 00:06:39.627 16:15:08 -- accel/accel.sh@20 -- # read -r var val 00:06:39.627 16:15:08 -- accel/accel.sh@21 -- # val=Yes 00:06:39.627 16:15:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.627 16:15:08 -- accel/accel.sh@20 -- # IFS=: 00:06:39.627 16:15:08 -- accel/accel.sh@20 -- # read -r var val 00:06:39.627 16:15:08 -- accel/accel.sh@21 -- # val= 00:06:39.627 16:15:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.627 16:15:08 -- accel/accel.sh@20 -- # IFS=: 00:06:39.627 16:15:08 -- accel/accel.sh@20 -- # read -r var val 00:06:39.627 16:15:08 -- accel/accel.sh@21 -- # val= 00:06:39.627 16:15:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.627 16:15:08 -- accel/accel.sh@20 -- # IFS=: 00:06:39.627 16:15:08 -- accel/accel.sh@20 -- # read -r var val 00:06:40.563 16:15:09 -- accel/accel.sh@21 -- # val= 00:06:40.563 16:15:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.563 16:15:09 -- accel/accel.sh@20 -- # IFS=: 00:06:40.563 16:15:09 -- accel/accel.sh@20 -- # read -r var val 00:06:40.563 16:15:09 -- accel/accel.sh@21 -- # val= 00:06:40.563 16:15:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.563 16:15:09 -- accel/accel.sh@20 -- # IFS=: 00:06:40.563 16:15:09 -- accel/accel.sh@20 -- # read -r var val 00:06:40.563 16:15:09 -- accel/accel.sh@21 -- # val= 00:06:40.563 16:15:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.563 16:15:09 -- accel/accel.sh@20 -- # IFS=: 00:06:40.563 16:15:09 -- accel/accel.sh@20 -- # read -r var val 00:06:40.563 16:15:09 -- accel/accel.sh@21 -- # val= 00:06:40.563 16:15:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.563 16:15:09 -- accel/accel.sh@20 -- # IFS=: 00:06:40.563 16:15:09 -- accel/accel.sh@20 -- # read -r var val 00:06:40.563 16:15:09 -- accel/accel.sh@21 -- # val= 00:06:40.563 16:15:09 -- accel/accel.sh@22 -- # case "$var" in 
00:06:40.563 16:15:09 -- accel/accel.sh@20 -- # IFS=: 00:06:40.563 16:15:09 -- accel/accel.sh@20 -- # read -r var val 00:06:40.563 16:15:09 -- accel/accel.sh@21 -- # val= 00:06:40.563 16:15:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.563 16:15:09 -- accel/accel.sh@20 -- # IFS=: 00:06:40.563 16:15:09 -- accel/accel.sh@20 -- # read -r var val 00:06:40.563 16:15:09 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:40.563 16:15:09 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:06:40.563 16:15:09 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:40.563 00:06:40.563 real 0m2.584s 00:06:40.563 user 0m2.326s 00:06:40.563 sys 0m0.268s 00:06:40.563 16:15:09 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:40.563 16:15:09 -- common/autotest_common.sh@10 -- # set +x 00:06:40.563 ************************************ 00:06:40.563 END TEST accel_crc32c 00:06:40.563 ************************************ 00:06:40.822 16:15:09 -- accel/accel.sh@94 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:06:40.822 16:15:09 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:06:40.822 16:15:09 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:40.822 16:15:09 -- common/autotest_common.sh@10 -- # set +x 00:06:40.822 ************************************ 00:06:40.822 START TEST accel_crc32c_C2 00:06:40.822 ************************************ 00:06:40.822 16:15:09 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w crc32c -y -C 2 00:06:40.822 16:15:09 -- accel/accel.sh@16 -- # local accel_opc 00:06:40.822 16:15:09 -- accel/accel.sh@17 -- # local accel_module 00:06:40.822 16:15:09 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:40.822 16:15:09 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:40.822 16:15:09 -- accel/accel.sh@12 -- # build_accel_config 00:06:40.822 16:15:09 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:40.822 16:15:09 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:40.822 16:15:09 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:40.822 16:15:09 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:40.822 16:15:09 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:40.822 16:15:09 -- accel/accel.sh@41 -- # local IFS=, 00:06:40.822 16:15:09 -- accel/accel.sh@42 -- # jq -r . 00:06:40.822 [2024-07-20 16:15:09.411213] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
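Note: the crc32c bandwidth figure above reduces to transfers/s times the 4096-byte buffer size, reported in MiB/s with integer truncation:

    echo $(( 860960 * 4096 / 1048576 ))   # 3363, matching the single-core crc32c row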
00:06:40.822 [2024-07-20 16:15:09.411300] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2259257 ] 00:06:40.822 EAL: No free 2048 kB hugepages reported on node 1 00:06:40.822 [2024-07-20 16:15:09.481396] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:40.822 [2024-07-20 16:15:09.517321] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:42.196 16:15:10 -- accel/accel.sh@18 -- # out=' 00:06:42.196 SPDK Configuration: 00:06:42.196 Core mask: 0x1 00:06:42.196 00:06:42.196 Accel Perf Configuration: 00:06:42.196 Workload Type: crc32c 00:06:42.196 CRC-32C seed: 0 00:06:42.196 Transfer size: 4096 bytes 00:06:42.196 Vector count 2 00:06:42.196 Module: software 00:06:42.196 Queue depth: 32 00:06:42.196 Allocate depth: 32 00:06:42.196 # threads/core: 1 00:06:42.196 Run time: 1 seconds 00:06:42.196 Verify: Yes 00:06:42.196 00:06:42.196 Running for 1 seconds... 00:06:42.196 00:06:42.196 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:42.196 ------------------------------------------------------------------------------------ 00:06:42.196 0,0 601824/s 4701 MiB/s 0 0 00:06:42.196 ==================================================================================== 00:06:42.196 Total 601824/s 2350 MiB/s 0 0' 00:06:42.196 16:15:10 -- accel/accel.sh@20 -- # IFS=: 00:06:42.196 16:15:10 -- accel/accel.sh@20 -- # read -r var val 00:06:42.196 16:15:10 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:42.196 16:15:10 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:42.196 16:15:10 -- accel/accel.sh@12 -- # build_accel_config 00:06:42.196 16:15:10 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:42.196 16:15:10 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:42.196 16:15:10 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:42.196 16:15:10 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:42.196 16:15:10 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:42.196 16:15:10 -- accel/accel.sh@41 -- # local IFS=, 00:06:42.196 16:15:10 -- accel/accel.sh@42 -- # jq -r . 00:06:42.196 [2024-07-20 16:15:10.698895] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
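Note: for the -C 2 run just above, the per-core row and the Total row only agree if the vector count is applied to one and not the other: 601824 transfers/s over two 4096-byte source vectors gives the per-core figure, while the Total row appears to drop the multiplier. Since the transfer counts are identical, this reads as a reporting quirk of this accel_perf build rather than a real throughput difference:

    echo $(( 601824 * 4096 * 2 / 1048576 ))   # 4701 MiB/s, the per-core row
    echo $(( 601824 * 4096 / 1048576 ))       # 2350 MiB/s, the Total row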
00:06:42.196 [2024-07-20 16:15:10.698986] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2259523 ] 00:06:42.196 EAL: No free 2048 kB hugepages reported on node 1 00:06:42.196 [2024-07-20 16:15:10.767544] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:42.196 [2024-07-20 16:15:10.801919] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:42.196 16:15:10 -- accel/accel.sh@21 -- # val= 00:06:42.196 16:15:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.196 16:15:10 -- accel/accel.sh@20 -- # IFS=: 00:06:42.196 16:15:10 -- accel/accel.sh@20 -- # read -r var val 00:06:42.196 16:15:10 -- accel/accel.sh@21 -- # val= 00:06:42.196 16:15:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.196 16:15:10 -- accel/accel.sh@20 -- # IFS=: 00:06:42.196 16:15:10 -- accel/accel.sh@20 -- # read -r var val 00:06:42.196 16:15:10 -- accel/accel.sh@21 -- # val=0x1 00:06:42.196 16:15:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.196 16:15:10 -- accel/accel.sh@20 -- # IFS=: 00:06:42.196 16:15:10 -- accel/accel.sh@20 -- # read -r var val 00:06:42.196 16:15:10 -- accel/accel.sh@21 -- # val= 00:06:42.196 16:15:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.196 16:15:10 -- accel/accel.sh@20 -- # IFS=: 00:06:42.196 16:15:10 -- accel/accel.sh@20 -- # read -r var val 00:06:42.196 16:15:10 -- accel/accel.sh@21 -- # val= 00:06:42.196 16:15:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.196 16:15:10 -- accel/accel.sh@20 -- # IFS=: 00:06:42.196 16:15:10 -- accel/accel.sh@20 -- # read -r var val 00:06:42.196 16:15:10 -- accel/accel.sh@21 -- # val=crc32c 00:06:42.196 16:15:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.196 16:15:10 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:06:42.196 16:15:10 -- accel/accel.sh@20 -- # IFS=: 00:06:42.196 16:15:10 -- accel/accel.sh@20 -- # read -r var val 00:06:42.196 16:15:10 -- accel/accel.sh@21 -- # val=0 00:06:42.196 16:15:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.196 16:15:10 -- accel/accel.sh@20 -- # IFS=: 00:06:42.196 16:15:10 -- accel/accel.sh@20 -- # read -r var val 00:06:42.196 16:15:10 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:42.197 16:15:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.197 16:15:10 -- accel/accel.sh@20 -- # IFS=: 00:06:42.197 16:15:10 -- accel/accel.sh@20 -- # read -r var val 00:06:42.197 16:15:10 -- accel/accel.sh@21 -- # val= 00:06:42.197 16:15:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.197 16:15:10 -- accel/accel.sh@20 -- # IFS=: 00:06:42.197 16:15:10 -- accel/accel.sh@20 -- # read -r var val 00:06:42.197 16:15:10 -- accel/accel.sh@21 -- # val=software 00:06:42.197 16:15:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.197 16:15:10 -- accel/accel.sh@23 -- # accel_module=software 00:06:42.197 16:15:10 -- accel/accel.sh@20 -- # IFS=: 00:06:42.197 16:15:10 -- accel/accel.sh@20 -- # read -r var val 00:06:42.197 16:15:10 -- accel/accel.sh@21 -- # val=32 00:06:42.197 16:15:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.197 16:15:10 -- accel/accel.sh@20 -- # IFS=: 00:06:42.197 16:15:10 -- accel/accel.sh@20 -- # read -r var val 00:06:42.197 16:15:10 -- accel/accel.sh@21 -- # val=32 00:06:42.197 16:15:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.197 16:15:10 -- accel/accel.sh@20 -- # IFS=: 00:06:42.197 16:15:10 -- accel/accel.sh@20 -- # read -r var val 00:06:42.197 16:15:10 -- 
accel/accel.sh@21 -- # val=1 00:06:42.197 16:15:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.197 16:15:10 -- accel/accel.sh@20 -- # IFS=: 00:06:42.197 16:15:10 -- accel/accel.sh@20 -- # read -r var val 00:06:42.197 16:15:10 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:42.197 16:15:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.197 16:15:10 -- accel/accel.sh@20 -- # IFS=: 00:06:42.197 16:15:10 -- accel/accel.sh@20 -- # read -r var val 00:06:42.197 16:15:10 -- accel/accel.sh@21 -- # val=Yes 00:06:42.197 16:15:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.197 16:15:10 -- accel/accel.sh@20 -- # IFS=: 00:06:42.197 16:15:10 -- accel/accel.sh@20 -- # read -r var val 00:06:42.197 16:15:10 -- accel/accel.sh@21 -- # val= 00:06:42.197 16:15:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.197 16:15:10 -- accel/accel.sh@20 -- # IFS=: 00:06:42.197 16:15:10 -- accel/accel.sh@20 -- # read -r var val 00:06:42.197 16:15:10 -- accel/accel.sh@21 -- # val= 00:06:42.197 16:15:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.197 16:15:10 -- accel/accel.sh@20 -- # IFS=: 00:06:42.197 16:15:10 -- accel/accel.sh@20 -- # read -r var val 00:06:43.573 16:15:11 -- accel/accel.sh@21 -- # val= 00:06:43.573 16:15:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.573 16:15:11 -- accel/accel.sh@20 -- # IFS=: 00:06:43.573 16:15:11 -- accel/accel.sh@20 -- # read -r var val 00:06:43.573 16:15:11 -- accel/accel.sh@21 -- # val= 00:06:43.573 16:15:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.573 16:15:11 -- accel/accel.sh@20 -- # IFS=: 00:06:43.573 16:15:11 -- accel/accel.sh@20 -- # read -r var val 00:06:43.573 16:15:11 -- accel/accel.sh@21 -- # val= 00:06:43.573 16:15:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.573 16:15:11 -- accel/accel.sh@20 -- # IFS=: 00:06:43.574 16:15:11 -- accel/accel.sh@20 -- # read -r var val 00:06:43.574 16:15:11 -- accel/accel.sh@21 -- # val= 00:06:43.574 16:15:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.574 16:15:11 -- accel/accel.sh@20 -- # IFS=: 00:06:43.574 16:15:11 -- accel/accel.sh@20 -- # read -r var val 00:06:43.574 16:15:11 -- accel/accel.sh@21 -- # val= 00:06:43.574 16:15:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.574 16:15:11 -- accel/accel.sh@20 -- # IFS=: 00:06:43.574 16:15:11 -- accel/accel.sh@20 -- # read -r var val 00:06:43.574 16:15:11 -- accel/accel.sh@21 -- # val= 00:06:43.574 16:15:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.574 16:15:11 -- accel/accel.sh@20 -- # IFS=: 00:06:43.574 16:15:11 -- accel/accel.sh@20 -- # read -r var val 00:06:43.574 16:15:11 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:43.574 16:15:11 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:06:43.574 16:15:11 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:43.574 00:06:43.574 real 0m2.577s 00:06:43.574 user 0m2.340s 00:06:43.574 sys 0m0.247s 00:06:43.574 16:15:11 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:43.574 16:15:11 -- common/autotest_common.sh@10 -- # set +x 00:06:43.574 ************************************ 00:06:43.574 END TEST accel_crc32c_C2 00:06:43.574 ************************************ 00:06:43.574 16:15:12 -- accel/accel.sh@95 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:06:43.574 16:15:12 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:43.574 16:15:12 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:43.574 16:15:12 -- common/autotest_common.sh@10 -- # set +x 00:06:43.574 ************************************ 00:06:43.574 START TEST accel_copy 
00:06:43.574 ************************************ 00:06:43.574 16:15:12 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy -y 00:06:43.574 16:15:12 -- accel/accel.sh@16 -- # local accel_opc 00:06:43.574 16:15:12 -- accel/accel.sh@17 -- # local accel_module 00:06:43.574 16:15:12 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy -y 00:06:43.574 16:15:12 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:43.574 16:15:12 -- accel/accel.sh@12 -- # build_accel_config 00:06:43.574 16:15:12 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:43.574 16:15:12 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:43.574 16:15:12 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:43.574 16:15:12 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:43.574 16:15:12 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:43.574 16:15:12 -- accel/accel.sh@41 -- # local IFS=, 00:06:43.574 16:15:12 -- accel/accel.sh@42 -- # jq -r . 00:06:43.574 [2024-07-20 16:15:12.036417] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:43.574 [2024-07-20 16:15:12.036535] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2259812 ] 00:06:43.574 EAL: No free 2048 kB hugepages reported on node 1 00:06:43.574 [2024-07-20 16:15:12.106547] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:43.574 [2024-07-20 16:15:12.141831] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:44.510 16:15:13 -- accel/accel.sh@18 -- # out=' 00:06:44.510 SPDK Configuration: 00:06:44.511 Core mask: 0x1 00:06:44.511 00:06:44.511 Accel Perf Configuration: 00:06:44.511 Workload Type: copy 00:06:44.511 Transfer size: 4096 bytes 00:06:44.511 Vector count 1 00:06:44.511 Module: software 00:06:44.511 Queue depth: 32 00:06:44.511 Allocate depth: 32 00:06:44.511 # threads/core: 1 00:06:44.511 Run time: 1 seconds 00:06:44.511 Verify: Yes 00:06:44.511 00:06:44.511 Running for 1 seconds... 00:06:44.511 00:06:44.511 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:44.511 ------------------------------------------------------------------------------------ 00:06:44.511 0,0 548576/s 2142 MiB/s 0 0 00:06:44.511 ==================================================================================== 00:06:44.511 Total 548576/s 2142 MiB/s 0 0' 00:06:44.511 16:15:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.511 16:15:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.511 16:15:13 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:06:44.511 16:15:13 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:44.511 16:15:13 -- accel/accel.sh@12 -- # build_accel_config 00:06:44.511 16:15:13 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:44.511 16:15:13 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:44.511 16:15:13 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:44.511 16:15:13 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:44.511 16:15:13 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:44.511 16:15:13 -- accel/accel.sh@41 -- # local IFS=, 00:06:44.511 16:15:13 -- accel/accel.sh@42 -- # jq -r . 00:06:44.769 [2024-07-20 16:15:13.323461] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
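The Bandwidth column in each of these result tables follows directly from the Transfers column and the configured transfer size. A quick sanity check of the copy result above, written as a shell one-liner (this assumes MiB here means 2^20 bytes, which the reported figures bear out):

    # 548576 transfers/s x 4096 bytes per transfer, converted to MiB/s
    echo $(( 548576 * 4096 / 1048576 ))   # prints 2142, matching the table

The same check applies to every table that follows, scaling by the Transfer size shown in each test's configuration block.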
00:06:44.769 [2024-07-20 16:15:13.323551] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2260078 ] 00:06:44.769 EAL: No free 2048 kB hugepages reported on node 1 00:06:44.769 [2024-07-20 16:15:13.393268] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:44.769 [2024-07-20 16:15:13.428053] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:44.769 16:15:13 -- accel/accel.sh@21 -- # val= 00:06:44.769 16:15:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.769 16:15:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.769 16:15:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.769 16:15:13 -- accel/accel.sh@21 -- # val= 00:06:44.769 16:15:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.769 16:15:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.769 16:15:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.769 16:15:13 -- accel/accel.sh@21 -- # val=0x1 00:06:44.769 16:15:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.770 16:15:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.770 16:15:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.770 16:15:13 -- accel/accel.sh@21 -- # val= 00:06:44.770 16:15:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.770 16:15:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.770 16:15:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.770 16:15:13 -- accel/accel.sh@21 -- # val= 00:06:44.770 16:15:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.770 16:15:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.770 16:15:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.770 16:15:13 -- accel/accel.sh@21 -- # val=copy 00:06:44.770 16:15:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.770 16:15:13 -- accel/accel.sh@24 -- # accel_opc=copy 00:06:44.770 16:15:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.770 16:15:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.770 16:15:13 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:44.770 16:15:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.770 16:15:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.770 16:15:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.770 16:15:13 -- accel/accel.sh@21 -- # val= 00:06:44.770 16:15:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.770 16:15:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.770 16:15:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.770 16:15:13 -- accel/accel.sh@21 -- # val=software 00:06:44.770 16:15:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.770 16:15:13 -- accel/accel.sh@23 -- # accel_module=software 00:06:44.770 16:15:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.770 16:15:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.770 16:15:13 -- accel/accel.sh@21 -- # val=32 00:06:44.770 16:15:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.770 16:15:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.770 16:15:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.770 16:15:13 -- accel/accel.sh@21 -- # val=32 00:06:44.770 16:15:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.770 16:15:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.770 16:15:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.770 16:15:13 -- accel/accel.sh@21 -- # val=1 00:06:44.770 16:15:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.770 16:15:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.770 16:15:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.770 16:15:13 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:06:44.770 16:15:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.770 16:15:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.770 16:15:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.770 16:15:13 -- accel/accel.sh@21 -- # val=Yes 00:06:44.770 16:15:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.770 16:15:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.770 16:15:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.770 16:15:13 -- accel/accel.sh@21 -- # val= 00:06:44.770 16:15:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.770 16:15:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.770 16:15:13 -- accel/accel.sh@20 -- # read -r var val 00:06:44.770 16:15:13 -- accel/accel.sh@21 -- # val= 00:06:44.770 16:15:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.770 16:15:13 -- accel/accel.sh@20 -- # IFS=: 00:06:44.770 16:15:13 -- accel/accel.sh@20 -- # read -r var val 00:06:46.144 16:15:14 -- accel/accel.sh@21 -- # val= 00:06:46.144 16:15:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.144 16:15:14 -- accel/accel.sh@20 -- # IFS=: 00:06:46.144 16:15:14 -- accel/accel.sh@20 -- # read -r var val 00:06:46.144 16:15:14 -- accel/accel.sh@21 -- # val= 00:06:46.144 16:15:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.144 16:15:14 -- accel/accel.sh@20 -- # IFS=: 00:06:46.144 16:15:14 -- accel/accel.sh@20 -- # read -r var val 00:06:46.144 16:15:14 -- accel/accel.sh@21 -- # val= 00:06:46.144 16:15:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.144 16:15:14 -- accel/accel.sh@20 -- # IFS=: 00:06:46.144 16:15:14 -- accel/accel.sh@20 -- # read -r var val 00:06:46.144 16:15:14 -- accel/accel.sh@21 -- # val= 00:06:46.144 16:15:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.144 16:15:14 -- accel/accel.sh@20 -- # IFS=: 00:06:46.144 16:15:14 -- accel/accel.sh@20 -- # read -r var val 00:06:46.144 16:15:14 -- accel/accel.sh@21 -- # val= 00:06:46.144 16:15:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.144 16:15:14 -- accel/accel.sh@20 -- # IFS=: 00:06:46.144 16:15:14 -- accel/accel.sh@20 -- # read -r var val 00:06:46.144 16:15:14 -- accel/accel.sh@21 -- # val= 00:06:46.144 16:15:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.144 16:15:14 -- accel/accel.sh@20 -- # IFS=: 00:06:46.145 16:15:14 -- accel/accel.sh@20 -- # read -r var val 00:06:46.145 16:15:14 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:46.145 16:15:14 -- accel/accel.sh@28 -- # [[ -n copy ]] 00:06:46.145 16:15:14 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:46.145 00:06:46.145 real 0m2.579s 00:06:46.145 user 0m2.322s 00:06:46.145 sys 0m0.266s 00:06:46.145 16:15:14 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:46.145 16:15:14 -- common/autotest_common.sh@10 -- # set +x 00:06:46.145 ************************************ 00:06:46.145 END TEST accel_copy 00:06:46.145 ************************************ 00:06:46.145 16:15:14 -- accel/accel.sh@96 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:46.145 16:15:14 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:06:46.145 16:15:14 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:46.145 16:15:14 -- common/autotest_common.sh@10 -- # set +x 00:06:46.145 ************************************ 00:06:46.145 START TEST accel_fill 00:06:46.145 ************************************ 00:06:46.145 16:15:14 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:46.145 16:15:14 -- accel/accel.sh@16 -- # local accel_opc 
00:06:46.145 16:15:14 -- accel/accel.sh@17 -- # local accel_module 00:06:46.145 16:15:14 -- accel/accel.sh@18 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:46.145 16:15:14 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:46.145 16:15:14 -- accel/accel.sh@12 -- # build_accel_config 00:06:46.145 16:15:14 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:46.145 16:15:14 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:46.145 16:15:14 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:46.145 16:15:14 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:46.145 16:15:14 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:46.145 16:15:14 -- accel/accel.sh@41 -- # local IFS=, 00:06:46.145 16:15:14 -- accel/accel.sh@42 -- # jq -r . 00:06:46.145 [2024-07-20 16:15:14.664943] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:46.145 [2024-07-20 16:15:14.665050] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2260307 ] 00:06:46.145 EAL: No free 2048 kB hugepages reported on node 1 00:06:46.145 [2024-07-20 16:15:14.733255] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:46.145 [2024-07-20 16:15:14.768716] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:47.520 16:15:15 -- accel/accel.sh@18 -- # out=' 00:06:47.520 SPDK Configuration: 00:06:47.520 Core mask: 0x1 00:06:47.520 00:06:47.520 Accel Perf Configuration: 00:06:47.520 Workload Type: fill 00:06:47.520 Fill pattern: 0x80 00:06:47.520 Transfer size: 4096 bytes 00:06:47.520 Vector count 1 00:06:47.520 Module: software 00:06:47.520 Queue depth: 64 00:06:47.520 Allocate depth: 64 00:06:47.520 # threads/core: 1 00:06:47.520 Run time: 1 seconds 00:06:47.520 Verify: Yes 00:06:47.520 00:06:47.520 Running for 1 seconds... 00:06:47.520 00:06:47.520 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:47.520 ------------------------------------------------------------------------------------ 00:06:47.520 0,0 967488/s 3779 MiB/s 0 0 00:06:47.520 ==================================================================================== 00:06:47.520 Total 967488/s 3779 MiB/s 0 0' 00:06:47.520 16:15:15 -- accel/accel.sh@20 -- # IFS=: 00:06:47.520 16:15:15 -- accel/accel.sh@20 -- # read -r var val 00:06:47.520 16:15:15 -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:47.520 16:15:15 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:47.520 16:15:15 -- accel/accel.sh@12 -- # build_accel_config 00:06:47.520 16:15:15 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:47.520 16:15:15 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:47.520 16:15:15 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:47.520 16:15:15 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:47.520 16:15:15 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:47.520 16:15:15 -- accel/accel.sh@41 -- # local IFS=, 00:06:47.520 16:15:15 -- accel/accel.sh@42 -- # jq -r . 00:06:47.520 [2024-07-20 16:15:15.949561] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
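The harness invocation traced above shows how this fill run maps onto accel_perf flags: -w fill selects the workload, -f 128 is the decimal form of the 0x80 fill pattern in the configuration block, -q 64 and -a 64 set the queue and allocate depths, -t 1 is the one-second run time, and -y enables verification ("Verify: Yes"). A standalone reproduction sketch under those assumptions (run from the SPDK repo root after building the examples; the harness's extra -c /dev/fd/62 feeds it a generated JSON accel config and can be omitted when the default software module is acceptable):

    # Hypothetical standalone rerun of the fill case above
    ./build/examples/accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y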
00:06:47.520 [2024-07-20 16:15:15.949677] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2260454 ] 00:06:47.520 EAL: No free 2048 kB hugepages reported on node 1 00:06:47.520 [2024-07-20 16:15:16.017470] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:47.520 [2024-07-20 16:15:16.052165] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:47.520 16:15:16 -- accel/accel.sh@21 -- # val= 00:06:47.520 16:15:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.520 16:15:16 -- accel/accel.sh@20 -- # IFS=: 00:06:47.520 16:15:16 -- accel/accel.sh@20 -- # read -r var val 00:06:47.520 16:15:16 -- accel/accel.sh@21 -- # val= 00:06:47.521 16:15:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.521 16:15:16 -- accel/accel.sh@20 -- # IFS=: 00:06:47.521 16:15:16 -- accel/accel.sh@20 -- # read -r var val 00:06:47.521 16:15:16 -- accel/accel.sh@21 -- # val=0x1 00:06:47.521 16:15:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.521 16:15:16 -- accel/accel.sh@20 -- # IFS=: 00:06:47.521 16:15:16 -- accel/accel.sh@20 -- # read -r var val 00:06:47.521 16:15:16 -- accel/accel.sh@21 -- # val= 00:06:47.521 16:15:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.521 16:15:16 -- accel/accel.sh@20 -- # IFS=: 00:06:47.521 16:15:16 -- accel/accel.sh@20 -- # read -r var val 00:06:47.521 16:15:16 -- accel/accel.sh@21 -- # val= 00:06:47.521 16:15:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.521 16:15:16 -- accel/accel.sh@20 -- # IFS=: 00:06:47.521 16:15:16 -- accel/accel.sh@20 -- # read -r var val 00:06:47.521 16:15:16 -- accel/accel.sh@21 -- # val=fill 00:06:47.521 16:15:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.521 16:15:16 -- accel/accel.sh@24 -- # accel_opc=fill 00:06:47.521 16:15:16 -- accel/accel.sh@20 -- # IFS=: 00:06:47.521 16:15:16 -- accel/accel.sh@20 -- # read -r var val 00:06:47.521 16:15:16 -- accel/accel.sh@21 -- # val=0x80 00:06:47.521 16:15:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.521 16:15:16 -- accel/accel.sh@20 -- # IFS=: 00:06:47.521 16:15:16 -- accel/accel.sh@20 -- # read -r var val 00:06:47.521 16:15:16 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:47.521 16:15:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.521 16:15:16 -- accel/accel.sh@20 -- # IFS=: 00:06:47.521 16:15:16 -- accel/accel.sh@20 -- # read -r var val 00:06:47.521 16:15:16 -- accel/accel.sh@21 -- # val= 00:06:47.521 16:15:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.521 16:15:16 -- accel/accel.sh@20 -- # IFS=: 00:06:47.521 16:15:16 -- accel/accel.sh@20 -- # read -r var val 00:06:47.521 16:15:16 -- accel/accel.sh@21 -- # val=software 00:06:47.521 16:15:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.521 16:15:16 -- accel/accel.sh@23 -- # accel_module=software 00:06:47.521 16:15:16 -- accel/accel.sh@20 -- # IFS=: 00:06:47.521 16:15:16 -- accel/accel.sh@20 -- # read -r var val 00:06:47.521 16:15:16 -- accel/accel.sh@21 -- # val=64 00:06:47.521 16:15:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.521 16:15:16 -- accel/accel.sh@20 -- # IFS=: 00:06:47.521 16:15:16 -- accel/accel.sh@20 -- # read -r var val 00:06:47.521 16:15:16 -- accel/accel.sh@21 -- # val=64 00:06:47.521 16:15:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.521 16:15:16 -- accel/accel.sh@20 -- # IFS=: 00:06:47.521 16:15:16 -- accel/accel.sh@20 -- # read -r var val 00:06:47.521 16:15:16 -- 
accel/accel.sh@21 -- # val=1 00:06:47.521 16:15:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.521 16:15:16 -- accel/accel.sh@20 -- # IFS=: 00:06:47.521 16:15:16 -- accel/accel.sh@20 -- # read -r var val 00:06:47.521 16:15:16 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:47.521 16:15:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.521 16:15:16 -- accel/accel.sh@20 -- # IFS=: 00:06:47.521 16:15:16 -- accel/accel.sh@20 -- # read -r var val 00:06:47.521 16:15:16 -- accel/accel.sh@21 -- # val=Yes 00:06:47.521 16:15:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.521 16:15:16 -- accel/accel.sh@20 -- # IFS=: 00:06:47.521 16:15:16 -- accel/accel.sh@20 -- # read -r var val 00:06:47.521 16:15:16 -- accel/accel.sh@21 -- # val= 00:06:47.521 16:15:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.521 16:15:16 -- accel/accel.sh@20 -- # IFS=: 00:06:47.521 16:15:16 -- accel/accel.sh@20 -- # read -r var val 00:06:47.521 16:15:16 -- accel/accel.sh@21 -- # val= 00:06:47.521 16:15:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.521 16:15:16 -- accel/accel.sh@20 -- # IFS=: 00:06:47.521 16:15:16 -- accel/accel.sh@20 -- # read -r var val 00:06:48.456 16:15:17 -- accel/accel.sh@21 -- # val= 00:06:48.456 16:15:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.456 16:15:17 -- accel/accel.sh@20 -- # IFS=: 00:06:48.456 16:15:17 -- accel/accel.sh@20 -- # read -r var val 00:06:48.457 16:15:17 -- accel/accel.sh@21 -- # val= 00:06:48.457 16:15:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.457 16:15:17 -- accel/accel.sh@20 -- # IFS=: 00:06:48.457 16:15:17 -- accel/accel.sh@20 -- # read -r var val 00:06:48.457 16:15:17 -- accel/accel.sh@21 -- # val= 00:06:48.457 16:15:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.457 16:15:17 -- accel/accel.sh@20 -- # IFS=: 00:06:48.457 16:15:17 -- accel/accel.sh@20 -- # read -r var val 00:06:48.457 16:15:17 -- accel/accel.sh@21 -- # val= 00:06:48.457 16:15:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.457 16:15:17 -- accel/accel.sh@20 -- # IFS=: 00:06:48.457 16:15:17 -- accel/accel.sh@20 -- # read -r var val 00:06:48.457 16:15:17 -- accel/accel.sh@21 -- # val= 00:06:48.457 16:15:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.457 16:15:17 -- accel/accel.sh@20 -- # IFS=: 00:06:48.457 16:15:17 -- accel/accel.sh@20 -- # read -r var val 00:06:48.457 16:15:17 -- accel/accel.sh@21 -- # val= 00:06:48.457 16:15:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.457 16:15:17 -- accel/accel.sh@20 -- # IFS=: 00:06:48.457 16:15:17 -- accel/accel.sh@20 -- # read -r var val 00:06:48.457 16:15:17 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:48.457 16:15:17 -- accel/accel.sh@28 -- # [[ -n fill ]] 00:06:48.457 16:15:17 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:48.457 00:06:48.457 real 0m2.574s 00:06:48.457 user 0m2.321s 00:06:48.457 sys 0m0.262s 00:06:48.457 16:15:17 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:48.457 16:15:17 -- common/autotest_common.sh@10 -- # set +x 00:06:48.457 ************************************ 00:06:48.457 END TEST accel_fill 00:06:48.457 ************************************ 00:06:48.716 16:15:17 -- accel/accel.sh@97 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:06:48.716 16:15:17 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:48.716 16:15:17 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:48.716 16:15:17 -- common/autotest_common.sh@10 -- # set +x 00:06:48.716 ************************************ 00:06:48.716 START TEST 
accel_copy_crc32c 00:06:48.716 ************************************ 00:06:48.716 16:15:17 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy_crc32c -y 00:06:48.716 16:15:17 -- accel/accel.sh@16 -- # local accel_opc 00:06:48.716 16:15:17 -- accel/accel.sh@17 -- # local accel_module 00:06:48.716 16:15:17 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:48.716 16:15:17 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:48.716 16:15:17 -- accel/accel.sh@12 -- # build_accel_config 00:06:48.716 16:15:17 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:48.716 16:15:17 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:48.716 16:15:17 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:48.716 16:15:17 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:48.716 16:15:17 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:48.716 16:15:17 -- accel/accel.sh@41 -- # local IFS=, 00:06:48.716 16:15:17 -- accel/accel.sh@42 -- # jq -r . 00:06:48.716 [2024-07-20 16:15:17.288265] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:48.716 [2024-07-20 16:15:17.288360] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2260672 ] 00:06:48.716 EAL: No free 2048 kB hugepages reported on node 1 00:06:48.716 [2024-07-20 16:15:17.358243] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:48.716 [2024-07-20 16:15:17.394367] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:50.090 16:15:18 -- accel/accel.sh@18 -- # out=' 00:06:50.090 SPDK Configuration: 00:06:50.090 Core mask: 0x1 00:06:50.090 00:06:50.090 Accel Perf Configuration: 00:06:50.090 Workload Type: copy_crc32c 00:06:50.090 CRC-32C seed: 0 00:06:50.090 Vector size: 4096 bytes 00:06:50.090 Transfer size: 4096 bytes 00:06:50.090 Vector count 1 00:06:50.090 Module: software 00:06:50.090 Queue depth: 32 00:06:50.090 Allocate depth: 32 00:06:50.090 # threads/core: 1 00:06:50.090 Run time: 1 seconds 00:06:50.090 Verify: Yes 00:06:50.090 00:06:50.090 Running for 1 seconds... 00:06:50.090 00:06:50.090 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:50.090 ------------------------------------------------------------------------------------ 00:06:50.090 0,0 435104/s 1699 MiB/s 0 0 00:06:50.090 ==================================================================================== 00:06:50.090 Total 435104/s 1699 MiB/s 0 0' 00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # IFS=: 00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # read -r var val 00:06:50.090 16:15:18 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:50.090 16:15:18 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:50.090 16:15:18 -- accel/accel.sh@12 -- # build_accel_config 00:06:50.090 16:15:18 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:50.090 16:15:18 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:50.090 16:15:18 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:50.090 16:15:18 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:50.090 16:15:18 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:50.090 16:15:18 -- accel/accel.sh@41 -- # local IFS=, 00:06:50.090 16:15:18 -- accel/accel.sh@42 -- # jq -r . 
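copy_crc32c couples a buffer copy with a CRC-32C computation over the transferred data (seed 0, per the configuration block), which is why its software-path throughput of 1699 MiB/s lands well below the 2142 MiB/s of the plain copy case. Following the same reproduction sketch as before:

    ./build/examples/accel_perf -t 1 -w copy_crc32c -y   # CRC-32C seed defaults to 0, per the table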
00:06:50.090 [2024-07-20 16:15:18.573198] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:50.090 [2024-07-20 16:15:18.573290] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2260940 ] 00:06:50.090 EAL: No free 2048 kB hugepages reported on node 1 00:06:50.090 [2024-07-20 16:15:18.641373] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:50.090 [2024-07-20 16:15:18.675987] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:50.090 16:15:18 -- accel/accel.sh@21 -- # val= 00:06:50.090 16:15:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # IFS=: 00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # read -r var val 00:06:50.090 16:15:18 -- accel/accel.sh@21 -- # val= 00:06:50.090 16:15:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # IFS=: 00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # read -r var val 00:06:50.090 16:15:18 -- accel/accel.sh@21 -- # val=0x1 00:06:50.090 16:15:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # IFS=: 00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # read -r var val 00:06:50.090 16:15:18 -- accel/accel.sh@21 -- # val= 00:06:50.090 16:15:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # IFS=: 00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # read -r var val 00:06:50.090 16:15:18 -- accel/accel.sh@21 -- # val= 00:06:50.090 16:15:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # IFS=: 00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # read -r var val 00:06:50.090 16:15:18 -- accel/accel.sh@21 -- # val=copy_crc32c 00:06:50.090 16:15:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.090 16:15:18 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # IFS=: 00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # read -r var val 00:06:50.090 16:15:18 -- accel/accel.sh@21 -- # val=0 00:06:50.090 16:15:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # IFS=: 00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # read -r var val 00:06:50.090 16:15:18 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:50.090 16:15:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # IFS=: 00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # read -r var val 00:06:50.090 16:15:18 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:50.090 16:15:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # IFS=: 00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # read -r var val 00:06:50.090 16:15:18 -- accel/accel.sh@21 -- # val= 00:06:50.090 16:15:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # IFS=: 00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # read -r var val 00:06:50.090 16:15:18 -- accel/accel.sh@21 -- # val=software 00:06:50.090 16:15:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.090 16:15:18 -- accel/accel.sh@23 -- # accel_module=software 00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # IFS=: 00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # read -r var val 00:06:50.090 16:15:18 -- accel/accel.sh@21 -- # val=32 00:06:50.090 16:15:18 -- accel/accel.sh@22 -- # case "$var" in 
00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # IFS=: 00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # read -r var val 00:06:50.090 16:15:18 -- accel/accel.sh@21 -- # val=32 00:06:50.090 16:15:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # IFS=: 00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # read -r var val 00:06:50.090 16:15:18 -- accel/accel.sh@21 -- # val=1 00:06:50.090 16:15:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # IFS=: 00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # read -r var val 00:06:50.090 16:15:18 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:50.090 16:15:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # IFS=: 00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # read -r var val 00:06:50.090 16:15:18 -- accel/accel.sh@21 -- # val=Yes 00:06:50.090 16:15:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # IFS=: 00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # read -r var val 00:06:50.090 16:15:18 -- accel/accel.sh@21 -- # val= 00:06:50.090 16:15:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # IFS=: 00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # read -r var val 00:06:50.090 16:15:18 -- accel/accel.sh@21 -- # val= 00:06:50.090 16:15:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # IFS=: 00:06:50.090 16:15:18 -- accel/accel.sh@20 -- # read -r var val 00:06:51.463 16:15:19 -- accel/accel.sh@21 -- # val= 00:06:51.463 16:15:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.463 16:15:19 -- accel/accel.sh@20 -- # IFS=: 00:06:51.463 16:15:19 -- accel/accel.sh@20 -- # read -r var val 00:06:51.463 16:15:19 -- accel/accel.sh@21 -- # val= 00:06:51.463 16:15:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.463 16:15:19 -- accel/accel.sh@20 -- # IFS=: 00:06:51.463 16:15:19 -- accel/accel.sh@20 -- # read -r var val 00:06:51.463 16:15:19 -- accel/accel.sh@21 -- # val= 00:06:51.463 16:15:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.463 16:15:19 -- accel/accel.sh@20 -- # IFS=: 00:06:51.463 16:15:19 -- accel/accel.sh@20 -- # read -r var val 00:06:51.463 16:15:19 -- accel/accel.sh@21 -- # val= 00:06:51.463 16:15:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.463 16:15:19 -- accel/accel.sh@20 -- # IFS=: 00:06:51.463 16:15:19 -- accel/accel.sh@20 -- # read -r var val 00:06:51.463 16:15:19 -- accel/accel.sh@21 -- # val= 00:06:51.463 16:15:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.463 16:15:19 -- accel/accel.sh@20 -- # IFS=: 00:06:51.463 16:15:19 -- accel/accel.sh@20 -- # read -r var val 00:06:51.463 16:15:19 -- accel/accel.sh@21 -- # val= 00:06:51.463 16:15:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.463 16:15:19 -- accel/accel.sh@20 -- # IFS=: 00:06:51.463 16:15:19 -- accel/accel.sh@20 -- # read -r var val 00:06:51.463 16:15:19 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:51.463 16:15:19 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:06:51.463 16:15:19 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:51.463 00:06:51.463 real 0m2.574s 00:06:51.463 user 0m2.323s 00:06:51.463 sys 0m0.261s 00:06:51.463 16:15:19 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:51.463 16:15:19 -- common/autotest_common.sh@10 -- # set +x 00:06:51.463 ************************************ 00:06:51.463 END TEST accel_copy_crc32c 00:06:51.463 ************************************ 00:06:51.463 
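The START TEST / END TEST banners and the real/user/sys timings that bracket each case come from the run_test helper in autotest_common.sh, whose invocations are visible in the traces above. A hypothetical simplification of that pattern (a sketch only, not the actual helper, which among other things also manages the xtrace state seen in the log):

    run_test() {   # hypothetical sketch: first arg is the test name, the rest is the command
        local name=$1; shift
        printf '************************************\nSTART TEST %s\n************************************\n' "$name"
        time "$@"   # e.g. accel_test -t 1 -w copy_crc32c -y -C 2
        printf '************************************\nEND TEST %s\n************************************\n' "$name"
    }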
16:15:19 -- accel/accel.sh@98 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:06:51.463 16:15:19 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:06:51.463 16:15:19 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:51.463 16:15:19 -- common/autotest_common.sh@10 -- # set +x 00:06:51.463 ************************************ 00:06:51.464 START TEST accel_copy_crc32c_C2 00:06:51.464 ************************************ 00:06:51.464 16:15:19 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:06:51.464 16:15:19 -- accel/accel.sh@16 -- # local accel_opc 00:06:51.464 16:15:19 -- accel/accel.sh@17 -- # local accel_module 00:06:51.464 16:15:19 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:51.464 16:15:19 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:51.464 16:15:19 -- accel/accel.sh@12 -- # build_accel_config 00:06:51.464 16:15:19 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:51.464 16:15:19 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:51.464 16:15:19 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:51.464 16:15:19 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:51.464 16:15:19 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:51.464 16:15:19 -- accel/accel.sh@41 -- # local IFS=, 00:06:51.464 16:15:19 -- accel/accel.sh@42 -- # jq -r . 00:06:51.464 [2024-07-20 16:15:19.912666] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:51.464 [2024-07-20 16:15:19.912756] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2261221 ] 00:06:51.464 EAL: No free 2048 kB hugepages reported on node 1 00:06:51.464 [2024-07-20 16:15:19.982853] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:51.464 [2024-07-20 16:15:20.018955] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:52.397 16:15:21 -- accel/accel.sh@18 -- # out=' 00:06:52.397 SPDK Configuration: 00:06:52.397 Core mask: 0x1 00:06:52.397 00:06:52.397 Accel Perf Configuration: 00:06:52.397 Workload Type: copy_crc32c 00:06:52.397 CRC-32C seed: 0 00:06:52.397 Vector size: 4096 bytes 00:06:52.397 Transfer size: 8192 bytes 00:06:52.397 Vector count 2 00:06:52.397 Module: software 00:06:52.397 Queue depth: 32 00:06:52.397 Allocate depth: 32 00:06:52.397 # threads/core: 1 00:06:52.397 Run time: 1 seconds 00:06:52.397 Verify: Yes 00:06:52.397 00:06:52.397 Running for 1 seconds... 
00:06:52.397 00:06:52.397 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:52.397 ------------------------------------------------------------------------------------ 00:06:52.397 0,0 303296/s 2369 MiB/s 0 0 00:06:52.397 ==================================================================================== 00:06:52.397 Total 303296/s 2369 MiB/s 0 0' 00:06:52.397 16:15:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.397 16:15:21 -- accel/accel.sh@20 -- # read -r var val 00:06:52.397 16:15:21 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:52.397 16:15:21 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:52.397 16:15:21 -- accel/accel.sh@12 -- # build_accel_config 00:06:52.397 16:15:21 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:52.397 16:15:21 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:52.397 16:15:21 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:52.397 16:15:21 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:52.397 16:15:21 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:52.397 16:15:21 -- accel/accel.sh@41 -- # local IFS=, 00:06:52.397 16:15:21 -- accel/accel.sh@42 -- # jq -r . 00:06:52.656 [2024-07-20 16:15:21.201535] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:52.656 [2024-07-20 16:15:21.201651] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2261489 ] 00:06:52.656 EAL: No free 2048 kB hugepages reported on node 1 00:06:52.656 [2024-07-20 16:15:21.270925] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:52.656 [2024-07-20 16:15:21.305600] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:52.656 16:15:21 -- accel/accel.sh@21 -- # val= 00:06:52.656 16:15:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.656 16:15:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.656 16:15:21 -- accel/accel.sh@20 -- # read -r var val 00:06:52.656 16:15:21 -- accel/accel.sh@21 -- # val= 00:06:52.656 16:15:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.656 16:15:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.656 16:15:21 -- accel/accel.sh@20 -- # read -r var val 00:06:52.656 16:15:21 -- accel/accel.sh@21 -- # val=0x1 00:06:52.656 16:15:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.656 16:15:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.656 16:15:21 -- accel/accel.sh@20 -- # read -r var val 00:06:52.656 16:15:21 -- accel/accel.sh@21 -- # val= 00:06:52.656 16:15:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.656 16:15:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.656 16:15:21 -- accel/accel.sh@20 -- # read -r var val 00:06:52.656 16:15:21 -- accel/accel.sh@21 -- # val= 00:06:52.656 16:15:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.656 16:15:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.656 16:15:21 -- accel/accel.sh@20 -- # read -r var val 00:06:52.656 16:15:21 -- accel/accel.sh@21 -- # val=copy_crc32c 00:06:52.656 16:15:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.656 16:15:21 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:06:52.656 16:15:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.656 16:15:21 -- accel/accel.sh@20 -- # read -r var val 00:06:52.656 16:15:21 -- accel/accel.sh@21 -- # val=0 00:06:52.656 16:15:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.656 16:15:21 -- accel/accel.sh@20 -- # 
IFS=: 00:06:52.656 16:15:21 -- accel/accel.sh@20 -- # read -r var val 00:06:52.656 16:15:21 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:52.656 16:15:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.656 16:15:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.656 16:15:21 -- accel/accel.sh@20 -- # read -r var val 00:06:52.656 16:15:21 -- accel/accel.sh@21 -- # val='8192 bytes' 00:06:52.656 16:15:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.656 16:15:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.656 16:15:21 -- accel/accel.sh@20 -- # read -r var val 00:06:52.656 16:15:21 -- accel/accel.sh@21 -- # val= 00:06:52.656 16:15:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.656 16:15:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.656 16:15:21 -- accel/accel.sh@20 -- # read -r var val 00:06:52.656 16:15:21 -- accel/accel.sh@21 -- # val=software 00:06:52.656 16:15:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.656 16:15:21 -- accel/accel.sh@23 -- # accel_module=software 00:06:52.656 16:15:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.656 16:15:21 -- accel/accel.sh@20 -- # read -r var val 00:06:52.656 16:15:21 -- accel/accel.sh@21 -- # val=32 00:06:52.656 16:15:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.656 16:15:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.656 16:15:21 -- accel/accel.sh@20 -- # read -r var val 00:06:52.656 16:15:21 -- accel/accel.sh@21 -- # val=32 00:06:52.656 16:15:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.656 16:15:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.656 16:15:21 -- accel/accel.sh@20 -- # read -r var val 00:06:52.656 16:15:21 -- accel/accel.sh@21 -- # val=1 00:06:52.656 16:15:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.656 16:15:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.656 16:15:21 -- accel/accel.sh@20 -- # read -r var val 00:06:52.656 16:15:21 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:52.656 16:15:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.656 16:15:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.656 16:15:21 -- accel/accel.sh@20 -- # read -r var val 00:06:52.656 16:15:21 -- accel/accel.sh@21 -- # val=Yes 00:06:52.656 16:15:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.656 16:15:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.656 16:15:21 -- accel/accel.sh@20 -- # read -r var val 00:06:52.656 16:15:21 -- accel/accel.sh@21 -- # val= 00:06:52.656 16:15:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.656 16:15:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.656 16:15:21 -- accel/accel.sh@20 -- # read -r var val 00:06:52.656 16:15:21 -- accel/accel.sh@21 -- # val= 00:06:52.656 16:15:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.656 16:15:21 -- accel/accel.sh@20 -- # IFS=: 00:06:52.656 16:15:21 -- accel/accel.sh@20 -- # read -r var val 00:06:54.033 16:15:22 -- accel/accel.sh@21 -- # val= 00:06:54.033 16:15:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.033 16:15:22 -- accel/accel.sh@20 -- # IFS=: 00:06:54.033 16:15:22 -- accel/accel.sh@20 -- # read -r var val 00:06:54.033 16:15:22 -- accel/accel.sh@21 -- # val= 00:06:54.033 16:15:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.033 16:15:22 -- accel/accel.sh@20 -- # IFS=: 00:06:54.033 16:15:22 -- accel/accel.sh@20 -- # read -r var val 00:06:54.033 16:15:22 -- accel/accel.sh@21 -- # val= 00:06:54.033 16:15:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.033 16:15:22 -- accel/accel.sh@20 -- # IFS=: 00:06:54.033 16:15:22 -- accel/accel.sh@20 -- # read -r var val 00:06:54.033 16:15:22 -- accel/accel.sh@21 -- # val= 00:06:54.033 16:15:22 -- 
accel/accel.sh@22 -- # case "$var" in 00:06:54.033 16:15:22 -- accel/accel.sh@20 -- # IFS=: 00:06:54.033 16:15:22 -- accel/accel.sh@20 -- # read -r var val 00:06:54.033 16:15:22 -- accel/accel.sh@21 -- # val= 00:06:54.033 16:15:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.033 16:15:22 -- accel/accel.sh@20 -- # IFS=: 00:06:54.033 16:15:22 -- accel/accel.sh@20 -- # read -r var val 00:06:54.033 16:15:22 -- accel/accel.sh@21 -- # val= 00:06:54.033 16:15:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.033 16:15:22 -- accel/accel.sh@20 -- # IFS=: 00:06:54.033 16:15:22 -- accel/accel.sh@20 -- # read -r var val 00:06:54.033 16:15:22 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:54.033 16:15:22 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:06:54.033 16:15:22 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:54.033 00:06:54.033 real 0m2.582s 00:06:54.033 user 0m2.328s 00:06:54.033 sys 0m0.263s 00:06:54.033 16:15:22 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:54.033 16:15:22 -- common/autotest_common.sh@10 -- # set +x 00:06:54.033 ************************************ 00:06:54.033 END TEST accel_copy_crc32c_C2 00:06:54.033 ************************************ 00:06:54.033 16:15:22 -- accel/accel.sh@99 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:06:54.033 16:15:22 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:54.033 16:15:22 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:54.033 16:15:22 -- common/autotest_common.sh@10 -- # set +x 00:06:54.033 ************************************ 00:06:54.033 START TEST accel_dualcast 00:06:54.033 ************************************ 00:06:54.033 16:15:22 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dualcast -y 00:06:54.033 16:15:22 -- accel/accel.sh@16 -- # local accel_opc 00:06:54.033 16:15:22 -- accel/accel.sh@17 -- # local accel_module 00:06:54.033 16:15:22 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dualcast -y 00:06:54.033 16:15:22 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:54.033 16:15:22 -- accel/accel.sh@12 -- # build_accel_config 00:06:54.033 16:15:22 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:54.033 16:15:22 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:54.033 16:15:22 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:54.033 16:15:22 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:54.033 16:15:22 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:54.033 16:15:22 -- accel/accel.sh@41 -- # local IFS=, 00:06:54.033 16:15:22 -- accel/accel.sh@42 -- # jq -r . 00:06:54.033 [2024-07-20 16:15:22.541309] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
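One detail worth noting about the copy_crc32c_C2 table above: with -C 2 the two 4096-byte vectors make the effective transfer 8192 bytes, so the earlier bandwidth check still holds, and with only core 0 running the Total row simply repeats the per-core row:

    echo $(( 303296 * 8192 / 1048576 ))   # prints 2369 MiB/s, matching both rows of the table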
00:06:54.033 [2024-07-20 16:15:22.541396] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2261776 ] 00:06:54.033 EAL: No free 2048 kB hugepages reported on node 1 00:06:54.033 [2024-07-20 16:15:22.609609] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:54.033 [2024-07-20 16:15:22.644934] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:55.457 16:15:23 -- accel/accel.sh@18 -- # out=' 00:06:55.457 SPDK Configuration: 00:06:55.457 Core mask: 0x1 00:06:55.457 00:06:55.457 Accel Perf Configuration: 00:06:55.457 Workload Type: dualcast 00:06:55.457 Transfer size: 4096 bytes 00:06:55.457 Vector count 1 00:06:55.457 Module: software 00:06:55.457 Queue depth: 32 00:06:55.457 Allocate depth: 32 00:06:55.457 # threads/core: 1 00:06:55.457 Run time: 1 seconds 00:06:55.457 Verify: Yes 00:06:55.457 00:06:55.457 Running for 1 seconds... 00:06:55.457 00:06:55.457 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:55.457 ------------------------------------------------------------------------------------ 00:06:55.457 0,0 677024/s 2644 MiB/s 0 0 00:06:55.457 ==================================================================================== 00:06:55.457 Total 677024/s 2644 MiB/s 0 0' 00:06:55.457 16:15:23 -- accel/accel.sh@20 -- # IFS=: 00:06:55.457 16:15:23 -- accel/accel.sh@20 -- # read -r var val 00:06:55.457 16:15:23 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:06:55.457 16:15:23 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:55.457 16:15:23 -- accel/accel.sh@12 -- # build_accel_config 00:06:55.457 16:15:23 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:55.457 16:15:23 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:55.457 16:15:23 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:55.457 16:15:23 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:55.457 16:15:23 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:55.457 16:15:23 -- accel/accel.sh@41 -- # local IFS=, 00:06:55.457 16:15:23 -- accel/accel.sh@42 -- # jq -r . 00:06:55.457 [2024-07-20 16:15:23.825947] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
00:06:55.457 [2024-07-20 16:15:23.826038] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2261932 ] 00:06:55.457 EAL: No free 2048 kB hugepages reported on node 1 00:06:55.457 [2024-07-20 16:15:23.894472] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:55.457 [2024-07-20 16:15:23.929148] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:55.457 16:15:23 -- accel/accel.sh@21 -- # val= 00:06:55.457 16:15:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.457 16:15:23 -- accel/accel.sh@20 -- # IFS=: 00:06:55.457 16:15:23 -- accel/accel.sh@20 -- # read -r var val 00:06:55.457 16:15:23 -- accel/accel.sh@21 -- # val= 00:06:55.457 16:15:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.457 16:15:23 -- accel/accel.sh@20 -- # IFS=: 00:06:55.457 16:15:23 -- accel/accel.sh@20 -- # read -r var val 00:06:55.457 16:15:23 -- accel/accel.sh@21 -- # val=0x1 00:06:55.457 16:15:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.457 16:15:23 -- accel/accel.sh@20 -- # IFS=: 00:06:55.457 16:15:23 -- accel/accel.sh@20 -- # read -r var val 00:06:55.457 16:15:23 -- accel/accel.sh@21 -- # val= 00:06:55.457 16:15:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.457 16:15:23 -- accel/accel.sh@20 -- # IFS=: 00:06:55.457 16:15:23 -- accel/accel.sh@20 -- # read -r var val 00:06:55.457 16:15:23 -- accel/accel.sh@21 -- # val= 00:06:55.457 16:15:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.457 16:15:23 -- accel/accel.sh@20 -- # IFS=: 00:06:55.457 16:15:23 -- accel/accel.sh@20 -- # read -r var val 00:06:55.457 16:15:23 -- accel/accel.sh@21 -- # val=dualcast 00:06:55.457 16:15:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.457 16:15:23 -- accel/accel.sh@24 -- # accel_opc=dualcast 00:06:55.457 16:15:23 -- accel/accel.sh@20 -- # IFS=: 00:06:55.457 16:15:23 -- accel/accel.sh@20 -- # read -r var val 00:06:55.457 16:15:23 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:55.457 16:15:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.457 16:15:23 -- accel/accel.sh@20 -- # IFS=: 00:06:55.457 16:15:23 -- accel/accel.sh@20 -- # read -r var val 00:06:55.457 16:15:23 -- accel/accel.sh@21 -- # val= 00:06:55.457 16:15:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.457 16:15:23 -- accel/accel.sh@20 -- # IFS=: 00:06:55.457 16:15:23 -- accel/accel.sh@20 -- # read -r var val 00:06:55.457 16:15:23 -- accel/accel.sh@21 -- # val=software 00:06:55.457 16:15:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.457 16:15:23 -- accel/accel.sh@23 -- # accel_module=software 00:06:55.457 16:15:23 -- accel/accel.sh@20 -- # IFS=: 00:06:55.457 16:15:23 -- accel/accel.sh@20 -- # read -r var val 00:06:55.457 16:15:23 -- accel/accel.sh@21 -- # val=32 00:06:55.457 16:15:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.457 16:15:23 -- accel/accel.sh@20 -- # IFS=: 00:06:55.457 16:15:23 -- accel/accel.sh@20 -- # read -r var val 00:06:55.457 16:15:23 -- accel/accel.sh@21 -- # val=32 00:06:55.457 16:15:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.457 16:15:23 -- accel/accel.sh@20 -- # IFS=: 00:06:55.457 16:15:23 -- accel/accel.sh@20 -- # read -r var val 00:06:55.458 16:15:23 -- accel/accel.sh@21 -- # val=1 00:06:55.458 16:15:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.458 16:15:23 -- accel/accel.sh@20 -- # IFS=: 00:06:55.458 16:15:23 -- accel/accel.sh@20 -- # read -r var val 00:06:55.458 16:15:23 
-- accel/accel.sh@21 -- # val='1 seconds' 00:06:55.458 16:15:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.458 16:15:23 -- accel/accel.sh@20 -- # IFS=: 00:06:55.458 16:15:23 -- accel/accel.sh@20 -- # read -r var val 00:06:55.458 16:15:23 -- accel/accel.sh@21 -- # val=Yes 00:06:55.458 16:15:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.458 16:15:23 -- accel/accel.sh@20 -- # IFS=: 00:06:55.458 16:15:23 -- accel/accel.sh@20 -- # read -r var val 00:06:55.458 16:15:23 -- accel/accel.sh@21 -- # val= 00:06:55.458 16:15:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.458 16:15:23 -- accel/accel.sh@20 -- # IFS=: 00:06:55.458 16:15:23 -- accel/accel.sh@20 -- # read -r var val 00:06:55.458 16:15:23 -- accel/accel.sh@21 -- # val= 00:06:55.458 16:15:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.458 16:15:23 -- accel/accel.sh@20 -- # IFS=: 00:06:55.458 16:15:23 -- accel/accel.sh@20 -- # read -r var val 00:06:56.416 16:15:25 -- accel/accel.sh@21 -- # val= 00:06:56.416 16:15:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.416 16:15:25 -- accel/accel.sh@20 -- # IFS=: 00:06:56.416 16:15:25 -- accel/accel.sh@20 -- # read -r var val 00:06:56.416 16:15:25 -- accel/accel.sh@21 -- # val= 00:06:56.416 16:15:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.416 16:15:25 -- accel/accel.sh@20 -- # IFS=: 00:06:56.416 16:15:25 -- accel/accel.sh@20 -- # read -r var val 00:06:56.416 16:15:25 -- accel/accel.sh@21 -- # val= 00:06:56.416 16:15:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.416 16:15:25 -- accel/accel.sh@20 -- # IFS=: 00:06:56.416 16:15:25 -- accel/accel.sh@20 -- # read -r var val 00:06:56.416 16:15:25 -- accel/accel.sh@21 -- # val= 00:06:56.416 16:15:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.416 16:15:25 -- accel/accel.sh@20 -- # IFS=: 00:06:56.416 16:15:25 -- accel/accel.sh@20 -- # read -r var val 00:06:56.416 16:15:25 -- accel/accel.sh@21 -- # val= 00:06:56.416 16:15:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.416 16:15:25 -- accel/accel.sh@20 -- # IFS=: 00:06:56.416 16:15:25 -- accel/accel.sh@20 -- # read -r var val 00:06:56.416 16:15:25 -- accel/accel.sh@21 -- # val= 00:06:56.416 16:15:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.416 16:15:25 -- accel/accel.sh@20 -- # IFS=: 00:06:56.416 16:15:25 -- accel/accel.sh@20 -- # read -r var val 00:06:56.416 16:15:25 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:56.416 16:15:25 -- accel/accel.sh@28 -- # [[ -n dualcast ]] 00:06:56.416 16:15:25 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:56.416 00:06:56.416 real 0m2.575s 00:06:56.416 user 0m2.321s 00:06:56.416 sys 0m0.263s 00:06:56.416 16:15:25 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:56.416 16:15:25 -- common/autotest_common.sh@10 -- # set +x 00:06:56.416 ************************************ 00:06:56.416 END TEST accel_dualcast 00:06:56.416 ************************************ 00:06:56.416 16:15:25 -- accel/accel.sh@100 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:06:56.416 16:15:25 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:56.416 16:15:25 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:56.416 16:15:25 -- common/autotest_common.sh@10 -- # set +x 00:06:56.416 ************************************ 00:06:56.416 START TEST accel_compare 00:06:56.417 ************************************ 00:06:56.417 16:15:25 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w compare -y 00:06:56.417 16:15:25 -- accel/accel.sh@16 -- # local accel_opc 00:06:56.417 16:15:25 
-- accel/accel.sh@17 -- # local accel_module 00:06:56.417 16:15:25 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compare -y 00:06:56.417 16:15:25 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:56.417 16:15:25 -- accel/accel.sh@12 -- # build_accel_config 00:06:56.417 16:15:25 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:56.417 16:15:25 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:56.417 16:15:25 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:56.417 16:15:25 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:56.417 16:15:25 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:56.417 16:15:25 -- accel/accel.sh@41 -- # local IFS=, 00:06:56.417 16:15:25 -- accel/accel.sh@42 -- # jq -r . 00:06:56.417 [2024-07-20 16:15:25.167942] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:56.417 [2024-07-20 16:15:25.168034] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2262121 ] 00:06:56.417 EAL: No free 2048 kB hugepages reported on node 1 00:06:56.674 [2024-07-20 16:15:25.238898] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:56.674 [2024-07-20 16:15:25.275404] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.050 16:15:26 -- accel/accel.sh@18 -- # out=' 00:06:58.050 SPDK Configuration: 00:06:58.050 Core mask: 0x1 00:06:58.050 00:06:58.050 Accel Perf Configuration: 00:06:58.050 Workload Type: compare 00:06:58.050 Transfer size: 4096 bytes 00:06:58.050 Vector count 1 00:06:58.050 Module: software 00:06:58.050 Queue depth: 32 00:06:58.050 Allocate depth: 32 00:06:58.050 # threads/core: 1 00:06:58.050 Run time: 1 seconds 00:06:58.050 Verify: Yes 00:06:58.050 00:06:58.050 Running for 1 seconds... 00:06:58.050 00:06:58.050 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:58.050 ------------------------------------------------------------------------------------ 00:06:58.050 0,0 829984/s 3242 MiB/s 0 0 00:06:58.050 ==================================================================================== 00:06:58.051 Total 829984/s 3242 MiB/s 0 0' 00:06:58.051 16:15:26 -- accel/accel.sh@20 -- # IFS=: 00:06:58.051 16:15:26 -- accel/accel.sh@20 -- # read -r var val 00:06:58.051 16:15:26 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:06:58.051 16:15:26 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:58.051 16:15:26 -- accel/accel.sh@12 -- # build_accel_config 00:06:58.051 16:15:26 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:58.051 16:15:26 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:58.051 16:15:26 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:58.051 16:15:26 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:58.051 16:15:26 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:58.051 16:15:26 -- accel/accel.sh@41 -- # local IFS=, 00:06:58.051 16:15:26 -- accel/accel.sh@42 -- # jq -r . 00:06:58.051 [2024-07-20 16:15:26.456055] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
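compare reads two buffers and checks them for equality rather than writing out a destination, which helps explain why it outruns the plain copy case here (3242 MiB/s against 2142 MiB/s). Reproduction follows the same sketch as the earlier workloads:

    ./build/examples/accel_perf -t 1 -w compare -y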
00:06:58.051 [2024-07-20 16:15:26.456145] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2262360 ] 00:06:58.051 EAL: No free 2048 kB hugepages reported on node 1 00:06:58.051 [2024-07-20 16:15:26.523958] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:58.051 [2024-07-20 16:15:26.558577] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.051 16:15:26 -- accel/accel.sh@21 -- # val= 00:06:58.051 16:15:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.051 16:15:26 -- accel/accel.sh@20 -- # IFS=: 00:06:58.051 16:15:26 -- accel/accel.sh@20 -- # read -r var val 00:06:58.051 16:15:26 -- accel/accel.sh@21 -- # val= 00:06:58.051 16:15:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.051 16:15:26 -- accel/accel.sh@20 -- # IFS=: 00:06:58.051 16:15:26 -- accel/accel.sh@20 -- # read -r var val 00:06:58.051 16:15:26 -- accel/accel.sh@21 -- # val=0x1 00:06:58.051 16:15:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.051 16:15:26 -- accel/accel.sh@20 -- # IFS=: 00:06:58.051 16:15:26 -- accel/accel.sh@20 -- # read -r var val 00:06:58.051 16:15:26 -- accel/accel.sh@21 -- # val= 00:06:58.051 16:15:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.051 16:15:26 -- accel/accel.sh@20 -- # IFS=: 00:06:58.051 16:15:26 -- accel/accel.sh@20 -- # read -r var val 00:06:58.051 16:15:26 -- accel/accel.sh@21 -- # val= 00:06:58.051 16:15:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.051 16:15:26 -- accel/accel.sh@20 -- # IFS=: 00:06:58.051 16:15:26 -- accel/accel.sh@20 -- # read -r var val 00:06:58.051 16:15:26 -- accel/accel.sh@21 -- # val=compare 00:06:58.051 16:15:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.051 16:15:26 -- accel/accel.sh@24 -- # accel_opc=compare 00:06:58.051 16:15:26 -- accel/accel.sh@20 -- # IFS=: 00:06:58.051 16:15:26 -- accel/accel.sh@20 -- # read -r var val 00:06:58.051 16:15:26 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:58.051 16:15:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.051 16:15:26 -- accel/accel.sh@20 -- # IFS=: 00:06:58.051 16:15:26 -- accel/accel.sh@20 -- # read -r var val 00:06:58.051 16:15:26 -- accel/accel.sh@21 -- # val= 00:06:58.051 16:15:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.051 16:15:26 -- accel/accel.sh@20 -- # IFS=: 00:06:58.051 16:15:26 -- accel/accel.sh@20 -- # read -r var val 00:06:58.051 16:15:26 -- accel/accel.sh@21 -- # val=software 00:06:58.051 16:15:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.051 16:15:26 -- accel/accel.sh@23 -- # accel_module=software 00:06:58.051 16:15:26 -- accel/accel.sh@20 -- # IFS=: 00:06:58.051 16:15:26 -- accel/accel.sh@20 -- # read -r var val 00:06:58.051 16:15:26 -- accel/accel.sh@21 -- # val=32 00:06:58.051 16:15:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.051 16:15:26 -- accel/accel.sh@20 -- # IFS=: 00:06:58.051 16:15:26 -- accel/accel.sh@20 -- # read -r var val 00:06:58.051 16:15:26 -- accel/accel.sh@21 -- # val=32 00:06:58.051 16:15:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.051 16:15:26 -- accel/accel.sh@20 -- # IFS=: 00:06:58.051 16:15:26 -- accel/accel.sh@20 -- # read -r var val 00:06:58.051 16:15:26 -- accel/accel.sh@21 -- # val=1 00:06:58.051 16:15:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.051 16:15:26 -- accel/accel.sh@20 -- # IFS=: 00:06:58.051 16:15:26 -- accel/accel.sh@20 -- # read -r var val 00:06:58.051 16:15:26 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:06:58.051 16:15:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.051 16:15:26 -- accel/accel.sh@20 -- # IFS=: 00:06:58.051 16:15:26 -- accel/accel.sh@20 -- # read -r var val 00:06:58.051 16:15:26 -- accel/accel.sh@21 -- # val=Yes 00:06:58.051 16:15:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.051 16:15:26 -- accel/accel.sh@20 -- # IFS=: 00:06:58.051 16:15:26 -- accel/accel.sh@20 -- # read -r var val 00:06:58.051 16:15:26 -- accel/accel.sh@21 -- # val= 00:06:58.051 16:15:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.051 16:15:26 -- accel/accel.sh@20 -- # IFS=: 00:06:58.051 16:15:26 -- accel/accel.sh@20 -- # read -r var val 00:06:58.051 16:15:26 -- accel/accel.sh@21 -- # val= 00:06:58.051 16:15:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.051 16:15:26 -- accel/accel.sh@20 -- # IFS=: 00:06:58.051 16:15:26 -- accel/accel.sh@20 -- # read -r var val 00:06:58.985 16:15:27 -- accel/accel.sh@21 -- # val= 00:06:58.985 16:15:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.985 16:15:27 -- accel/accel.sh@20 -- # IFS=: 00:06:58.985 16:15:27 -- accel/accel.sh@20 -- # read -r var val 00:06:58.985 16:15:27 -- accel/accel.sh@21 -- # val= 00:06:58.985 16:15:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.985 16:15:27 -- accel/accel.sh@20 -- # IFS=: 00:06:58.985 16:15:27 -- accel/accel.sh@20 -- # read -r var val 00:06:58.985 16:15:27 -- accel/accel.sh@21 -- # val= 00:06:58.985 16:15:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.985 16:15:27 -- accel/accel.sh@20 -- # IFS=: 00:06:58.985 16:15:27 -- accel/accel.sh@20 -- # read -r var val 00:06:58.985 16:15:27 -- accel/accel.sh@21 -- # val= 00:06:58.985 16:15:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.985 16:15:27 -- accel/accel.sh@20 -- # IFS=: 00:06:58.985 16:15:27 -- accel/accel.sh@20 -- # read -r var val 00:06:58.985 16:15:27 -- accel/accel.sh@21 -- # val= 00:06:58.985 16:15:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.985 16:15:27 -- accel/accel.sh@20 -- # IFS=: 00:06:58.985 16:15:27 -- accel/accel.sh@20 -- # read -r var val 00:06:58.985 16:15:27 -- accel/accel.sh@21 -- # val= 00:06:58.985 16:15:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.985 16:15:27 -- accel/accel.sh@20 -- # IFS=: 00:06:58.985 16:15:27 -- accel/accel.sh@20 -- # read -r var val 00:06:58.985 16:15:27 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:58.985 16:15:27 -- accel/accel.sh@28 -- # [[ -n compare ]] 00:06:58.985 16:15:27 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:58.985 00:06:58.985 real 0m2.578s 00:06:58.985 user 0m2.331s 00:06:58.985 sys 0m0.256s 00:06:58.985 16:15:27 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:58.985 16:15:27 -- common/autotest_common.sh@10 -- # set +x 00:06:58.985 ************************************ 00:06:58.985 END TEST accel_compare 00:06:58.985 ************************************ 00:06:58.985 16:15:27 -- accel/accel.sh@101 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:06:58.985 16:15:27 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:58.985 16:15:27 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:58.985 16:15:27 -- common/autotest_common.sh@10 -- # set +x 00:06:58.985 ************************************ 00:06:58.985 START TEST accel_xor 00:06:58.985 ************************************ 00:06:58.985 16:15:27 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w xor -y 00:06:58.985 16:15:27 -- accel/accel.sh@16 -- # local accel_opc 00:06:58.985 16:15:27 -- accel/accel.sh@17 
-- # local accel_module 00:06:58.985 16:15:27 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y 00:06:58.985 16:15:27 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:06:58.985 16:15:27 -- accel/accel.sh@12 -- # build_accel_config 00:06:58.985 16:15:27 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:58.985 16:15:27 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:58.985 16:15:27 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:58.985 16:15:27 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:58.986 16:15:27 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:58.986 16:15:27 -- accel/accel.sh@41 -- # local IFS=, 00:06:58.986 16:15:27 -- accel/accel.sh@42 -- # jq -r . 00:06:59.244 [2024-07-20 16:15:27.792695] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:06:59.244 [2024-07-20 16:15:27.792787] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2262644 ] 00:06:59.244 EAL: No free 2048 kB hugepages reported on node 1 00:06:59.244 [2024-07-20 16:15:27.862876] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:59.244 [2024-07-20 16:15:27.898739] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:00.618 16:15:29 -- accel/accel.sh@18 -- # out=' 00:07:00.618 SPDK Configuration: 00:07:00.618 Core mask: 0x1 00:07:00.618 00:07:00.618 Accel Perf Configuration: 00:07:00.618 Workload Type: xor 00:07:00.618 Source buffers: 2 00:07:00.618 Transfer size: 4096 bytes 00:07:00.618 Vector count 1 00:07:00.618 Module: software 00:07:00.618 Queue depth: 32 00:07:00.618 Allocate depth: 32 00:07:00.618 # threads/core: 1 00:07:00.618 Run time: 1 seconds 00:07:00.618 Verify: Yes 00:07:00.618 00:07:00.618 Running for 1 seconds... 00:07:00.618 00:07:00.618 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:00.618 ------------------------------------------------------------------------------------ 00:07:00.618 0,0 693984/s 2710 MiB/s 0 0 00:07:00.618 ==================================================================================== 00:07:00.618 Total 693984/s 2710 MiB/s 0 0' 00:07:00.618 16:15:29 -- accel/accel.sh@20 -- # IFS=: 00:07:00.619 16:15:29 -- accel/accel.sh@20 -- # read -r var val 00:07:00.619 16:15:29 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:07:00.619 16:15:29 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:07:00.619 16:15:29 -- accel/accel.sh@12 -- # build_accel_config 00:07:00.619 16:15:29 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:00.619 16:15:29 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:00.619 16:15:29 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:00.619 16:15:29 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:00.619 16:15:29 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:00.619 16:15:29 -- accel/accel.sh@41 -- # local IFS=, 00:07:00.619 16:15:29 -- accel/accel.sh@42 -- # jq -r . 00:07:00.619 [2024-07-20 16:15:29.079291] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
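This second series exercises the "xor" opcode, which XORs the configured number of source buffers into one destination; with no -x argument it defaults to two sources, matching the "Source buffers: 2" line above. A standalone sketch under the same assumptions as before:

  # XOR two 4 KiB source buffers into a destination and verify the output
  ./build/examples/accel_perf -t 1 -w xor -y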
00:07:00.619 [2024-07-20 16:15:29.079408] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2262913 ] 00:07:00.619 EAL: No free 2048 kB hugepages reported on node 1 00:07:00.619 [2024-07-20 16:15:29.148857] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:00.619 [2024-07-20 16:15:29.183190] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:00.619 16:15:29 -- accel/accel.sh@21 -- # val= 00:07:00.619 16:15:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.619 16:15:29 -- accel/accel.sh@20 -- # IFS=: 00:07:00.619 16:15:29 -- accel/accel.sh@20 -- # read -r var val 00:07:00.619 16:15:29 -- accel/accel.sh@21 -- # val= 00:07:00.619 16:15:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.619 16:15:29 -- accel/accel.sh@20 -- # IFS=: 00:07:00.619 16:15:29 -- accel/accel.sh@20 -- # read -r var val 00:07:00.619 16:15:29 -- accel/accel.sh@21 -- # val=0x1 00:07:00.619 16:15:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.619 16:15:29 -- accel/accel.sh@20 -- # IFS=: 00:07:00.619 16:15:29 -- accel/accel.sh@20 -- # read -r var val 00:07:00.619 16:15:29 -- accel/accel.sh@21 -- # val= 00:07:00.619 16:15:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.619 16:15:29 -- accel/accel.sh@20 -- # IFS=: 00:07:00.619 16:15:29 -- accel/accel.sh@20 -- # read -r var val 00:07:00.619 16:15:29 -- accel/accel.sh@21 -- # val= 00:07:00.619 16:15:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.619 16:15:29 -- accel/accel.sh@20 -- # IFS=: 00:07:00.619 16:15:29 -- accel/accel.sh@20 -- # read -r var val 00:07:00.619 16:15:29 -- accel/accel.sh@21 -- # val=xor 00:07:00.619 16:15:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.619 16:15:29 -- accel/accel.sh@24 -- # accel_opc=xor 00:07:00.619 16:15:29 -- accel/accel.sh@20 -- # IFS=: 00:07:00.619 16:15:29 -- accel/accel.sh@20 -- # read -r var val 00:07:00.619 16:15:29 -- accel/accel.sh@21 -- # val=2 00:07:00.619 16:15:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.619 16:15:29 -- accel/accel.sh@20 -- # IFS=: 00:07:00.619 16:15:29 -- accel/accel.sh@20 -- # read -r var val 00:07:00.619 16:15:29 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:00.619 16:15:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.619 16:15:29 -- accel/accel.sh@20 -- # IFS=: 00:07:00.619 16:15:29 -- accel/accel.sh@20 -- # read -r var val 00:07:00.619 16:15:29 -- accel/accel.sh@21 -- # val= 00:07:00.619 16:15:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.619 16:15:29 -- accel/accel.sh@20 -- # IFS=: 00:07:00.619 16:15:29 -- accel/accel.sh@20 -- # read -r var val 00:07:00.619 16:15:29 -- accel/accel.sh@21 -- # val=software 00:07:00.619 16:15:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.619 16:15:29 -- accel/accel.sh@23 -- # accel_module=software 00:07:00.619 16:15:29 -- accel/accel.sh@20 -- # IFS=: 00:07:00.619 16:15:29 -- accel/accel.sh@20 -- # read -r var val 00:07:00.619 16:15:29 -- accel/accel.sh@21 -- # val=32 00:07:00.619 16:15:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.619 16:15:29 -- accel/accel.sh@20 -- # IFS=: 00:07:00.619 16:15:29 -- accel/accel.sh@20 -- # read -r var val 00:07:00.619 16:15:29 -- accel/accel.sh@21 -- # val=32 00:07:00.619 16:15:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.619 16:15:29 -- accel/accel.sh@20 -- # IFS=: 00:07:00.619 16:15:29 -- accel/accel.sh@20 -- # read -r var val 00:07:00.619 16:15:29 -- 
accel/accel.sh@21 -- # val=1 00:07:00.619 16:15:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.619 16:15:29 -- accel/accel.sh@20 -- # IFS=: 00:07:00.619 16:15:29 -- accel/accel.sh@20 -- # read -r var val 00:07:00.619 16:15:29 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:00.619 16:15:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.619 16:15:29 -- accel/accel.sh@20 -- # IFS=: 00:07:00.619 16:15:29 -- accel/accel.sh@20 -- # read -r var val 00:07:00.619 16:15:29 -- accel/accel.sh@21 -- # val=Yes 00:07:00.619 16:15:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.619 16:15:29 -- accel/accel.sh@20 -- # IFS=: 00:07:00.619 16:15:29 -- accel/accel.sh@20 -- # read -r var val 00:07:00.619 16:15:29 -- accel/accel.sh@21 -- # val= 00:07:00.619 16:15:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.619 16:15:29 -- accel/accel.sh@20 -- # IFS=: 00:07:00.619 16:15:29 -- accel/accel.sh@20 -- # read -r var val 00:07:00.619 16:15:29 -- accel/accel.sh@21 -- # val= 00:07:00.619 16:15:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.619 16:15:29 -- accel/accel.sh@20 -- # IFS=: 00:07:00.619 16:15:29 -- accel/accel.sh@20 -- # read -r var val 00:07:01.553 16:15:30 -- accel/accel.sh@21 -- # val= 00:07:01.553 16:15:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.553 16:15:30 -- accel/accel.sh@20 -- # IFS=: 00:07:01.553 16:15:30 -- accel/accel.sh@20 -- # read -r var val 00:07:01.553 16:15:30 -- accel/accel.sh@21 -- # val= 00:07:01.553 16:15:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.553 16:15:30 -- accel/accel.sh@20 -- # IFS=: 00:07:01.553 16:15:30 -- accel/accel.sh@20 -- # read -r var val 00:07:01.553 16:15:30 -- accel/accel.sh@21 -- # val= 00:07:01.553 16:15:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.553 16:15:30 -- accel/accel.sh@20 -- # IFS=: 00:07:01.553 16:15:30 -- accel/accel.sh@20 -- # read -r var val 00:07:01.553 16:15:30 -- accel/accel.sh@21 -- # val= 00:07:01.553 16:15:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.553 16:15:30 -- accel/accel.sh@20 -- # IFS=: 00:07:01.553 16:15:30 -- accel/accel.sh@20 -- # read -r var val 00:07:01.553 16:15:30 -- accel/accel.sh@21 -- # val= 00:07:01.553 16:15:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.553 16:15:30 -- accel/accel.sh@20 -- # IFS=: 00:07:01.553 16:15:30 -- accel/accel.sh@20 -- # read -r var val 00:07:01.553 16:15:30 -- accel/accel.sh@21 -- # val= 00:07:01.553 16:15:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.553 16:15:30 -- accel/accel.sh@20 -- # IFS=: 00:07:01.553 16:15:30 -- accel/accel.sh@20 -- # read -r var val 00:07:01.553 16:15:30 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:01.553 16:15:30 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:07:01.553 16:15:30 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:01.553 00:07:01.553 real 0m2.577s 00:07:01.553 user 0m2.332s 00:07:01.553 sys 0m0.252s 00:07:01.553 16:15:30 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:01.553 16:15:30 -- common/autotest_common.sh@10 -- # set +x 00:07:01.553 ************************************ 00:07:01.553 END TEST accel_xor 00:07:01.553 ************************************ 00:07:01.811 16:15:30 -- accel/accel.sh@102 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:07:01.811 16:15:30 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:07:01.811 16:15:30 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:01.811 16:15:30 -- common/autotest_common.sh@10 -- # set +x 00:07:01.811 ************************************ 00:07:01.811 START TEST accel_xor 
00:07:01.811 ************************************ 00:07:01.811 16:15:30 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w xor -y -x 3 00:07:01.811 16:15:30 -- accel/accel.sh@16 -- # local accel_opc 00:07:01.811 16:15:30 -- accel/accel.sh@17 -- # local accel_module 00:07:01.811 16:15:30 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y -x 3 00:07:01.811 16:15:30 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:07:01.811 16:15:30 -- accel/accel.sh@12 -- # build_accel_config 00:07:01.811 16:15:30 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:01.811 16:15:30 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:01.811 16:15:30 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:01.811 16:15:30 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:01.811 16:15:30 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:01.811 16:15:30 -- accel/accel.sh@41 -- # local IFS=, 00:07:01.811 16:15:30 -- accel/accel.sh@42 -- # jq -r . 00:07:01.811 [2024-07-20 16:15:30.419471] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:07:01.811 [2024-07-20 16:15:30.419561] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2263197 ] 00:07:01.811 EAL: No free 2048 kB hugepages reported on node 1 00:07:01.811 [2024-07-20 16:15:30.488157] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:01.811 [2024-07-20 16:15:30.523399] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:03.188 16:15:31 -- accel/accel.sh@18 -- # out=' 00:07:03.188 SPDK Configuration: 00:07:03.188 Core mask: 0x1 00:07:03.188 00:07:03.188 Accel Perf Configuration: 00:07:03.188 Workload Type: xor 00:07:03.188 Source buffers: 3 00:07:03.188 Transfer size: 4096 bytes 00:07:03.188 Vector count 1 00:07:03.188 Module: software 00:07:03.188 Queue depth: 32 00:07:03.188 Allocate depth: 32 00:07:03.188 # threads/core: 1 00:07:03.188 Run time: 1 seconds 00:07:03.188 Verify: Yes 00:07:03.188 00:07:03.188 Running for 1 seconds... 00:07:03.188 00:07:03.188 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:03.188 ------------------------------------------------------------------------------------ 00:07:03.188 0,0 652480/s 2548 MiB/s 0 0 00:07:03.188 ==================================================================================== 00:07:03.188 Total 652480/s 2548 MiB/s 0 0' 00:07:03.188 16:15:31 -- accel/accel.sh@20 -- # IFS=: 00:07:03.188 16:15:31 -- accel/accel.sh@20 -- # read -r var val 00:07:03.188 16:15:31 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:07:03.188 16:15:31 -- accel/accel.sh@12 -- # build_accel_config 00:07:03.188 16:15:31 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:07:03.188 16:15:31 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:03.188 16:15:31 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:03.188 16:15:31 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:03.188 16:15:31 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:03.188 16:15:31 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:03.188 16:15:31 -- accel/accel.sh@41 -- # local IFS=, 00:07:03.188 16:15:31 -- accel/accel.sh@42 -- # jq -r . 00:07:03.188 [2024-07-20 16:15:31.703296] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
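This is the xor workload again, rerun with three source buffers: the "-x 3" in the run_test line sets the source-buffer count, hence "Source buffers: 3" in the configuration and the modest throughput drop (652480/s vs 693984/s) from the extra input stream. A sketch under the same assumptions:

  # XOR three source buffers into one destination
  ./build/examples/accel_perf -t 1 -w xor -y -x 3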
00:07:03.188 [2024-07-20 16:15:31.703388] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2263436 ] 00:07:03.188 EAL: No free 2048 kB hugepages reported on node 1 00:07:03.188 [2024-07-20 16:15:31.772141] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:03.188 [2024-07-20 16:15:31.806510] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:03.188 16:15:31 -- accel/accel.sh@21 -- # val= 00:07:03.188 16:15:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.188 16:15:31 -- accel/accel.sh@20 -- # IFS=: 00:07:03.188 16:15:31 -- accel/accel.sh@20 -- # read -r var val 00:07:03.188 16:15:31 -- accel/accel.sh@21 -- # val= 00:07:03.188 16:15:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.188 16:15:31 -- accel/accel.sh@20 -- # IFS=: 00:07:03.188 16:15:31 -- accel/accel.sh@20 -- # read -r var val 00:07:03.188 16:15:31 -- accel/accel.sh@21 -- # val=0x1 00:07:03.188 16:15:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.188 16:15:31 -- accel/accel.sh@20 -- # IFS=: 00:07:03.188 16:15:31 -- accel/accel.sh@20 -- # read -r var val 00:07:03.188 16:15:31 -- accel/accel.sh@21 -- # val= 00:07:03.188 16:15:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.188 16:15:31 -- accel/accel.sh@20 -- # IFS=: 00:07:03.188 16:15:31 -- accel/accel.sh@20 -- # read -r var val 00:07:03.188 16:15:31 -- accel/accel.sh@21 -- # val= 00:07:03.188 16:15:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.188 16:15:31 -- accel/accel.sh@20 -- # IFS=: 00:07:03.188 16:15:31 -- accel/accel.sh@20 -- # read -r var val 00:07:03.188 16:15:31 -- accel/accel.sh@21 -- # val=xor 00:07:03.188 16:15:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.188 16:15:31 -- accel/accel.sh@24 -- # accel_opc=xor 00:07:03.188 16:15:31 -- accel/accel.sh@20 -- # IFS=: 00:07:03.188 16:15:31 -- accel/accel.sh@20 -- # read -r var val 00:07:03.188 16:15:31 -- accel/accel.sh@21 -- # val=3 00:07:03.188 16:15:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.188 16:15:31 -- accel/accel.sh@20 -- # IFS=: 00:07:03.188 16:15:31 -- accel/accel.sh@20 -- # read -r var val 00:07:03.188 16:15:31 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:03.188 16:15:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.188 16:15:31 -- accel/accel.sh@20 -- # IFS=: 00:07:03.188 16:15:31 -- accel/accel.sh@20 -- # read -r var val 00:07:03.189 16:15:31 -- accel/accel.sh@21 -- # val= 00:07:03.189 16:15:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.189 16:15:31 -- accel/accel.sh@20 -- # IFS=: 00:07:03.189 16:15:31 -- accel/accel.sh@20 -- # read -r var val 00:07:03.189 16:15:31 -- accel/accel.sh@21 -- # val=software 00:07:03.189 16:15:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.189 16:15:31 -- accel/accel.sh@23 -- # accel_module=software 00:07:03.189 16:15:31 -- accel/accel.sh@20 -- # IFS=: 00:07:03.189 16:15:31 -- accel/accel.sh@20 -- # read -r var val 00:07:03.189 16:15:31 -- accel/accel.sh@21 -- # val=32 00:07:03.189 16:15:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.189 16:15:31 -- accel/accel.sh@20 -- # IFS=: 00:07:03.189 16:15:31 -- accel/accel.sh@20 -- # read -r var val 00:07:03.189 16:15:31 -- accel/accel.sh@21 -- # val=32 00:07:03.189 16:15:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.189 16:15:31 -- accel/accel.sh@20 -- # IFS=: 00:07:03.189 16:15:31 -- accel/accel.sh@20 -- # read -r var val 00:07:03.189 16:15:31 -- 
accel/accel.sh@21 -- # val=1 00:07:03.189 16:15:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.189 16:15:31 -- accel/accel.sh@20 -- # IFS=: 00:07:03.189 16:15:31 -- accel/accel.sh@20 -- # read -r var val 00:07:03.189 16:15:31 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:03.189 16:15:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.189 16:15:31 -- accel/accel.sh@20 -- # IFS=: 00:07:03.189 16:15:31 -- accel/accel.sh@20 -- # read -r var val 00:07:03.189 16:15:31 -- accel/accel.sh@21 -- # val=Yes 00:07:03.189 16:15:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.189 16:15:31 -- accel/accel.sh@20 -- # IFS=: 00:07:03.189 16:15:31 -- accel/accel.sh@20 -- # read -r var val 00:07:03.189 16:15:31 -- accel/accel.sh@21 -- # val= 00:07:03.189 16:15:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.189 16:15:31 -- accel/accel.sh@20 -- # IFS=: 00:07:03.189 16:15:31 -- accel/accel.sh@20 -- # read -r var val 00:07:03.189 16:15:31 -- accel/accel.sh@21 -- # val= 00:07:03.189 16:15:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.189 16:15:31 -- accel/accel.sh@20 -- # IFS=: 00:07:03.189 16:15:31 -- accel/accel.sh@20 -- # read -r var val 00:07:04.565 16:15:32 -- accel/accel.sh@21 -- # val= 00:07:04.565 16:15:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.565 16:15:32 -- accel/accel.sh@20 -- # IFS=: 00:07:04.565 16:15:32 -- accel/accel.sh@20 -- # read -r var val 00:07:04.565 16:15:32 -- accel/accel.sh@21 -- # val= 00:07:04.565 16:15:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.565 16:15:32 -- accel/accel.sh@20 -- # IFS=: 00:07:04.565 16:15:32 -- accel/accel.sh@20 -- # read -r var val 00:07:04.565 16:15:32 -- accel/accel.sh@21 -- # val= 00:07:04.565 16:15:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.565 16:15:32 -- accel/accel.sh@20 -- # IFS=: 00:07:04.565 16:15:32 -- accel/accel.sh@20 -- # read -r var val 00:07:04.565 16:15:32 -- accel/accel.sh@21 -- # val= 00:07:04.565 16:15:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.565 16:15:32 -- accel/accel.sh@20 -- # IFS=: 00:07:04.565 16:15:32 -- accel/accel.sh@20 -- # read -r var val 00:07:04.565 16:15:32 -- accel/accel.sh@21 -- # val= 00:07:04.565 16:15:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.565 16:15:32 -- accel/accel.sh@20 -- # IFS=: 00:07:04.565 16:15:32 -- accel/accel.sh@20 -- # read -r var val 00:07:04.565 16:15:32 -- accel/accel.sh@21 -- # val= 00:07:04.565 16:15:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.565 16:15:32 -- accel/accel.sh@20 -- # IFS=: 00:07:04.565 16:15:32 -- accel/accel.sh@20 -- # read -r var val 00:07:04.565 16:15:32 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:04.565 16:15:32 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:07:04.565 16:15:32 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:04.565 00:07:04.565 real 0m2.576s 00:07:04.565 user 0m2.324s 00:07:04.565 sys 0m0.261s 00:07:04.565 16:15:32 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:04.565 16:15:32 -- common/autotest_common.sh@10 -- # set +x 00:07:04.565 ************************************ 00:07:04.565 END TEST accel_xor 00:07:04.565 ************************************ 00:07:04.565 16:15:33 -- accel/accel.sh@103 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:07:04.565 16:15:33 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:07:04.565 16:15:33 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:04.565 16:15:33 -- common/autotest_common.sh@10 -- # set +x 00:07:04.565 ************************************ 00:07:04.565 START TEST 
accel_dif_verify 00:07:04.565 ************************************ 00:07:04.565 16:15:33 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_verify 00:07:04.565 16:15:33 -- accel/accel.sh@16 -- # local accel_opc 00:07:04.565 16:15:33 -- accel/accel.sh@17 -- # local accel_module 00:07:04.565 16:15:33 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_verify 00:07:04.565 16:15:33 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:07:04.565 16:15:33 -- accel/accel.sh@12 -- # build_accel_config 00:07:04.565 16:15:33 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:04.565 16:15:33 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:04.565 16:15:33 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:04.565 16:15:33 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:04.565 16:15:33 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:04.565 16:15:33 -- accel/accel.sh@41 -- # local IFS=, 00:07:04.565 16:15:33 -- accel/accel.sh@42 -- # jq -r . 00:07:04.565 [2024-07-20 16:15:33.041286] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:07:04.565 [2024-07-20 16:15:33.041375] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2263627 ] 00:07:04.565 EAL: No free 2048 kB hugepages reported on node 1 00:07:04.565 [2024-07-20 16:15:33.109319] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:04.565 [2024-07-20 16:15:33.144878] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:05.500 16:15:34 -- accel/accel.sh@18 -- # out=' 00:07:05.500 SPDK Configuration: 00:07:05.500 Core mask: 0x1 00:07:05.500 00:07:05.500 Accel Perf Configuration: 00:07:05.500 Workload Type: dif_verify 00:07:05.500 Vector size: 4096 bytes 00:07:05.500 Transfer size: 4096 bytes 00:07:05.500 Block size: 512 bytes 00:07:05.500 Metadata size: 8 bytes 00:07:05.500 Vector count 1 00:07:05.500 Module: software 00:07:05.500 Queue depth: 32 00:07:05.500 Allocate depth: 32 00:07:05.500 # threads/core: 1 00:07:05.500 Run time: 1 seconds 00:07:05.500 Verify: No 00:07:05.500 00:07:05.500 Running for 1 seconds... 00:07:05.501 00:07:05.501 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:05.501 ------------------------------------------------------------------------------------ 00:07:05.501 0,0 240160/s 952 MiB/s 0 0 00:07:05.501 ==================================================================================== 00:07:05.501 Total 240160/s 938 MiB/s 0 0' 00:07:05.759 16:15:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.759 16:15:34 -- accel/accel.sh@20 -- # read -r var val 00:07:05.759 16:15:34 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:07:05.759 16:15:34 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:07:05.759 16:15:34 -- accel/accel.sh@12 -- # build_accel_config 00:07:05.759 16:15:34 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:05.759 16:15:34 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:05.759 16:15:34 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:05.759 16:15:34 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:05.759 16:15:34 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:05.759 16:15:34 -- accel/accel.sh@41 -- # local IFS=, 00:07:05.759 16:15:34 -- accel/accel.sh@42 -- # jq -r . 
00:07:05.759 [2024-07-20 16:15:34.323562] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:07:05.759 [2024-07-20 16:15:34.323652] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2263779 ] 00:07:05.759 EAL: No free 2048 kB hugepages reported on node 1 00:07:05.759 [2024-07-20 16:15:34.394152] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:05.759 [2024-07-20 16:15:34.429075] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:05.759 16:15:34 -- accel/accel.sh@21 -- # val= 00:07:05.759 16:15:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.759 16:15:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.759 16:15:34 -- accel/accel.sh@20 -- # read -r var val 00:07:05.759 16:15:34 -- accel/accel.sh@21 -- # val= 00:07:05.759 16:15:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.759 16:15:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.759 16:15:34 -- accel/accel.sh@20 -- # read -r var val 00:07:05.759 16:15:34 -- accel/accel.sh@21 -- # val=0x1 00:07:05.759 16:15:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.759 16:15:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.759 16:15:34 -- accel/accel.sh@20 -- # read -r var val 00:07:05.759 16:15:34 -- accel/accel.sh@21 -- # val= 00:07:05.759 16:15:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.759 16:15:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.759 16:15:34 -- accel/accel.sh@20 -- # read -r var val 00:07:05.759 16:15:34 -- accel/accel.sh@21 -- # val= 00:07:05.759 16:15:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.759 16:15:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.759 16:15:34 -- accel/accel.sh@20 -- # read -r var val 00:07:05.759 16:15:34 -- accel/accel.sh@21 -- # val=dif_verify 00:07:05.759 16:15:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.759 16:15:34 -- accel/accel.sh@24 -- # accel_opc=dif_verify 00:07:05.759 16:15:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.759 16:15:34 -- accel/accel.sh@20 -- # read -r var val 00:07:05.759 16:15:34 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:05.759 16:15:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.759 16:15:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.759 16:15:34 -- accel/accel.sh@20 -- # read -r var val 00:07:05.759 16:15:34 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:05.759 16:15:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.760 16:15:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.760 16:15:34 -- accel/accel.sh@20 -- # read -r var val 00:07:05.760 16:15:34 -- accel/accel.sh@21 -- # val='512 bytes' 00:07:05.760 16:15:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.760 16:15:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.760 16:15:34 -- accel/accel.sh@20 -- # read -r var val 00:07:05.760 16:15:34 -- accel/accel.sh@21 -- # val='8 bytes' 00:07:05.760 16:15:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.760 16:15:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.760 16:15:34 -- accel/accel.sh@20 -- # read -r var val 00:07:05.760 16:15:34 -- accel/accel.sh@21 -- # val= 00:07:05.760 16:15:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.760 16:15:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.760 16:15:34 -- accel/accel.sh@20 -- # read -r var val 00:07:05.760 16:15:34 -- accel/accel.sh@21 -- # val=software 00:07:05.760 16:15:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.760 16:15:34 -- accel/accel.sh@23 -- # 
accel_module=software 00:07:05.760 16:15:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.760 16:15:34 -- accel/accel.sh@20 -- # read -r var val 00:07:05.760 16:15:34 -- accel/accel.sh@21 -- # val=32 00:07:05.760 16:15:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.760 16:15:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.760 16:15:34 -- accel/accel.sh@20 -- # read -r var val 00:07:05.760 16:15:34 -- accel/accel.sh@21 -- # val=32 00:07:05.760 16:15:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.760 16:15:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.760 16:15:34 -- accel/accel.sh@20 -- # read -r var val 00:07:05.760 16:15:34 -- accel/accel.sh@21 -- # val=1 00:07:05.760 16:15:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.760 16:15:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.760 16:15:34 -- accel/accel.sh@20 -- # read -r var val 00:07:05.760 16:15:34 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:05.760 16:15:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.760 16:15:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.760 16:15:34 -- accel/accel.sh@20 -- # read -r var val 00:07:05.760 16:15:34 -- accel/accel.sh@21 -- # val=No 00:07:05.760 16:15:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.760 16:15:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.760 16:15:34 -- accel/accel.sh@20 -- # read -r var val 00:07:05.760 16:15:34 -- accel/accel.sh@21 -- # val= 00:07:05.760 16:15:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.760 16:15:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.760 16:15:34 -- accel/accel.sh@20 -- # read -r var val 00:07:05.760 16:15:34 -- accel/accel.sh@21 -- # val= 00:07:05.760 16:15:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.760 16:15:34 -- accel/accel.sh@20 -- # IFS=: 00:07:05.760 16:15:34 -- accel/accel.sh@20 -- # read -r var val 00:07:07.137 16:15:35 -- accel/accel.sh@21 -- # val= 00:07:07.137 16:15:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.137 16:15:35 -- accel/accel.sh@20 -- # IFS=: 00:07:07.137 16:15:35 -- accel/accel.sh@20 -- # read -r var val 00:07:07.137 16:15:35 -- accel/accel.sh@21 -- # val= 00:07:07.137 16:15:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.137 16:15:35 -- accel/accel.sh@20 -- # IFS=: 00:07:07.137 16:15:35 -- accel/accel.sh@20 -- # read -r var val 00:07:07.137 16:15:35 -- accel/accel.sh@21 -- # val= 00:07:07.138 16:15:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.138 16:15:35 -- accel/accel.sh@20 -- # IFS=: 00:07:07.138 16:15:35 -- accel/accel.sh@20 -- # read -r var val 00:07:07.138 16:15:35 -- accel/accel.sh@21 -- # val= 00:07:07.138 16:15:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.138 16:15:35 -- accel/accel.sh@20 -- # IFS=: 00:07:07.138 16:15:35 -- accel/accel.sh@20 -- # read -r var val 00:07:07.138 16:15:35 -- accel/accel.sh@21 -- # val= 00:07:07.138 16:15:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.138 16:15:35 -- accel/accel.sh@20 -- # IFS=: 00:07:07.138 16:15:35 -- accel/accel.sh@20 -- # read -r var val 00:07:07.138 16:15:35 -- accel/accel.sh@21 -- # val= 00:07:07.138 16:15:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.138 16:15:35 -- accel/accel.sh@20 -- # IFS=: 00:07:07.138 16:15:35 -- accel/accel.sh@20 -- # read -r var val 00:07:07.138 16:15:35 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:07.138 16:15:35 -- accel/accel.sh@28 -- # [[ -n dif_verify ]] 00:07:07.138 16:15:35 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:07.138 00:07:07.138 real 0m2.572s 00:07:07.138 user 0m2.316s 00:07:07.138 sys 0m0.266s 00:07:07.138 16:15:35 -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:07:07.138 16:15:35 -- common/autotest_common.sh@10 -- # set +x 00:07:07.138 ************************************ 00:07:07.138 END TEST accel_dif_verify 00:07:07.138 ************************************ 00:07:07.138 16:15:35 -- accel/accel.sh@104 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:07:07.138 16:15:35 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:07:07.138 16:15:35 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:07.138 16:15:35 -- common/autotest_common.sh@10 -- # set +x 00:07:07.138 ************************************ 00:07:07.138 START TEST accel_dif_generate 00:07:07.138 ************************************ 00:07:07.138 16:15:35 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_generate 00:07:07.138 16:15:35 -- accel/accel.sh@16 -- # local accel_opc 00:07:07.138 16:15:35 -- accel/accel.sh@17 -- # local accel_module 00:07:07.138 16:15:35 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate 00:07:07.138 16:15:35 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:07:07.138 16:15:35 -- accel/accel.sh@12 -- # build_accel_config 00:07:07.138 16:15:35 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:07.138 16:15:35 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:07.138 16:15:35 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:07.138 16:15:35 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:07.138 16:15:35 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:07.138 16:15:35 -- accel/accel.sh@41 -- # local IFS=, 00:07:07.138 16:15:35 -- accel/accel.sh@42 -- # jq -r . 00:07:07.138 [2024-07-20 16:15:35.663036] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:07:07.138 [2024-07-20 16:15:35.663143] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2264063 ] 00:07:07.138 EAL: No free 2048 kB hugepages reported on node 1 00:07:07.138 [2024-07-20 16:15:35.731548] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:07.138 [2024-07-20 16:15:35.766649] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:08.515 16:15:36 -- accel/accel.sh@18 -- # out=' 00:07:08.515 SPDK Configuration: 00:07:08.515 Core mask: 0x1 00:07:08.515 00:07:08.515 Accel Perf Configuration: 00:07:08.515 Workload Type: dif_generate 00:07:08.515 Vector size: 4096 bytes 00:07:08.515 Transfer size: 4096 bytes 00:07:08.515 Block size: 512 bytes 00:07:08.515 Metadata size: 8 bytes 00:07:08.515 Vector count 1 00:07:08.515 Module: software 00:07:08.515 Queue depth: 32 00:07:08.515 Allocate depth: 32 00:07:08.515 # threads/core: 1 00:07:08.515 Run time: 1 seconds 00:07:08.515 Verify: No 00:07:08.515 00:07:08.515 Running for 1 seconds... 
00:07:08.515 00:07:08.515 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:08.515 ------------------------------------------------------------------------------------ 00:07:08.515 0,0 294656/s 1151 MiB/s 0 0 00:07:08.515 ==================================================================================== 00:07:08.515 Total 294656/s 1151 MiB/s 0 0' 00:07:08.515 16:15:36 -- accel/accel.sh@20 -- # IFS=: 00:07:08.515 16:15:36 -- accel/accel.sh@20 -- # read -r var val 00:07:08.515 16:15:36 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:07:08.515 16:15:36 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:07:08.515 16:15:36 -- accel/accel.sh@12 -- # build_accel_config 00:07:08.515 16:15:36 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:08.515 16:15:36 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:08.515 16:15:36 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:08.515 16:15:36 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:08.515 16:15:36 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:08.515 16:15:36 -- accel/accel.sh@41 -- # local IFS=, 00:07:08.515 16:15:36 -- accel/accel.sh@42 -- # jq -r . 00:07:08.515 [2024-07-20 16:15:36.946084] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:07:08.515 [2024-07-20 16:15:36.946191] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2264331 ] 00:07:08.515 EAL: No free 2048 kB hugepages reported on node 1 00:07:08.515 [2024-07-20 16:15:37.015211] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:08.515 [2024-07-20 16:15:37.049641] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:08.515 16:15:37 -- accel/accel.sh@21 -- # val= 00:07:08.516 16:15:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # IFS=: 00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # read -r var val 00:07:08.516 16:15:37 -- accel/accel.sh@21 -- # val= 00:07:08.516 16:15:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # IFS=: 00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # read -r var val 00:07:08.516 16:15:37 -- accel/accel.sh@21 -- # val=0x1 00:07:08.516 16:15:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # IFS=: 00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # read -r var val 00:07:08.516 16:15:37 -- accel/accel.sh@21 -- # val= 00:07:08.516 16:15:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # IFS=: 00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # read -r var val 00:07:08.516 16:15:37 -- accel/accel.sh@21 -- # val= 00:07:08.516 16:15:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # IFS=: 00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # read -r var val 00:07:08.516 16:15:37 -- accel/accel.sh@21 -- # val=dif_generate 00:07:08.516 16:15:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.516 16:15:37 -- accel/accel.sh@24 -- # accel_opc=dif_generate 00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # IFS=: 00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # read -r var val 00:07:08.516 16:15:37 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:08.516 16:15:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # IFS=:
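dif_generate is the write-side counterpart of dif_verify: it computes fresh DIF metadata over the data blocks instead of checking existing tags, which plausibly accounts for the higher rate here (294656/s vs 240160/s for verify, since no comparison step is needed). A sketch under the same assumptions:

  # generate DIF metadata over 512-byte protected blocks
  ./build/examples/accel_perf -t 1 -w dif_generate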
00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # read -r var val 00:07:08.516 16:15:37 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:08.516 16:15:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # IFS=: 00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # read -r var val 00:07:08.516 16:15:37 -- accel/accel.sh@21 -- # val='512 bytes' 00:07:08.516 16:15:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # IFS=: 00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # read -r var val 00:07:08.516 16:15:37 -- accel/accel.sh@21 -- # val='8 bytes' 00:07:08.516 16:15:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # IFS=: 00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # read -r var val 00:07:08.516 16:15:37 -- accel/accel.sh@21 -- # val= 00:07:08.516 16:15:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # IFS=: 00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # read -r var val 00:07:08.516 16:15:37 -- accel/accel.sh@21 -- # val=software 00:07:08.516 16:15:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.516 16:15:37 -- accel/accel.sh@23 -- # accel_module=software 00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # IFS=: 00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # read -r var val 00:07:08.516 16:15:37 -- accel/accel.sh@21 -- # val=32 00:07:08.516 16:15:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # IFS=: 00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # read -r var val 00:07:08.516 16:15:37 -- accel/accel.sh@21 -- # val=32 00:07:08.516 16:15:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # IFS=: 00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # read -r var val 00:07:08.516 16:15:37 -- accel/accel.sh@21 -- # val=1 00:07:08.516 16:15:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # IFS=: 00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # read -r var val 00:07:08.516 16:15:37 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:08.516 16:15:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # IFS=: 00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # read -r var val 00:07:08.516 16:15:37 -- accel/accel.sh@21 -- # val=No 00:07:08.516 16:15:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # IFS=: 00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # read -r var val 00:07:08.516 16:15:37 -- accel/accel.sh@21 -- # val= 00:07:08.516 16:15:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # IFS=: 00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # read -r var val 00:07:08.516 16:15:37 -- accel/accel.sh@21 -- # val= 00:07:08.516 16:15:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # IFS=: 00:07:08.516 16:15:37 -- accel/accel.sh@20 -- # read -r var val 00:07:09.452 16:15:38 -- accel/accel.sh@21 -- # val= 00:07:09.453 16:15:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.453 16:15:38 -- accel/accel.sh@20 -- # IFS=: 00:07:09.453 16:15:38 -- accel/accel.sh@20 -- # read -r var val 00:07:09.453 16:15:38 -- accel/accel.sh@21 -- # val= 00:07:09.453 16:15:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.453 16:15:38 -- accel/accel.sh@20 -- # IFS=: 00:07:09.453 16:15:38 -- accel/accel.sh@20 -- # read -r var val 00:07:09.453 16:15:38 -- accel/accel.sh@21 -- # val= 00:07:09.453 16:15:38 -- 
accel/accel.sh@22 -- # case "$var" in 00:07:09.453 16:15:38 -- accel/accel.sh@20 -- # IFS=: 00:07:09.453 16:15:38 -- accel/accel.sh@20 -- # read -r var val 00:07:09.453 16:15:38 -- accel/accel.sh@21 -- # val= 00:07:09.453 16:15:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.453 16:15:38 -- accel/accel.sh@20 -- # IFS=: 00:07:09.453 16:15:38 -- accel/accel.sh@20 -- # read -r var val 00:07:09.453 16:15:38 -- accel/accel.sh@21 -- # val= 00:07:09.453 16:15:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.453 16:15:38 -- accel/accel.sh@20 -- # IFS=: 00:07:09.453 16:15:38 -- accel/accel.sh@20 -- # read -r var val 00:07:09.453 16:15:38 -- accel/accel.sh@21 -- # val= 00:07:09.453 16:15:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.453 16:15:38 -- accel/accel.sh@20 -- # IFS=: 00:07:09.453 16:15:38 -- accel/accel.sh@20 -- # read -r var val 00:07:09.453 16:15:38 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:09.453 16:15:38 -- accel/accel.sh@28 -- # [[ -n dif_generate ]] 00:07:09.453 16:15:38 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:09.453 00:07:09.453 real 0m2.572s 00:07:09.453 user 0m2.328s 00:07:09.453 sys 0m0.254s 00:07:09.453 16:15:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:09.453 16:15:38 -- common/autotest_common.sh@10 -- # set +x 00:07:09.453 ************************************ 00:07:09.453 END TEST accel_dif_generate 00:07:09.453 ************************************ 00:07:09.711 16:15:38 -- accel/accel.sh@105 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:07:09.711 16:15:38 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:07:09.711 16:15:38 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:09.711 16:15:38 -- common/autotest_common.sh@10 -- # set +x 00:07:09.711 ************************************ 00:07:09.711 START TEST accel_dif_generate_copy 00:07:09.711 ************************************ 00:07:09.711 16:15:38 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_generate_copy 00:07:09.711 16:15:38 -- accel/accel.sh@16 -- # local accel_opc 00:07:09.711 16:15:38 -- accel/accel.sh@17 -- # local accel_module 00:07:09.711 16:15:38 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate_copy 00:07:09.711 16:15:38 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:07:09.711 16:15:38 -- accel/accel.sh@12 -- # build_accel_config 00:07:09.711 16:15:38 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:09.711 16:15:38 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:09.711 16:15:38 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:09.711 16:15:38 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:09.711 16:15:38 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:09.711 16:15:38 -- accel/accel.sh@41 -- # local IFS=, 00:07:09.711 16:15:38 -- accel/accel.sh@42 -- # jq -r . 00:07:09.711 [2024-07-20 16:15:38.285738] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
00:07:09.711 [2024-07-20 16:15:38.285827] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2264612 ] 00:07:09.711 EAL: No free 2048 kB hugepages reported on node 1 00:07:09.711 [2024-07-20 16:15:38.353742] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:09.711 [2024-07-20 16:15:38.389662] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:11.085 16:15:39 -- accel/accel.sh@18 -- # out=' 00:07:11.085 SPDK Configuration: 00:07:11.085 Core mask: 0x1 00:07:11.085 00:07:11.085 Accel Perf Configuration: 00:07:11.085 Workload Type: dif_generate_copy 00:07:11.085 Vector size: 4096 bytes 00:07:11.085 Transfer size: 4096 bytes 00:07:11.085 Vector count 1 00:07:11.085 Module: software 00:07:11.085 Queue depth: 32 00:07:11.085 Allocate depth: 32 00:07:11.085 # threads/core: 1 00:07:11.085 Run time: 1 seconds 00:07:11.085 Verify: No 00:07:11.085 00:07:11.085 Running for 1 seconds... 00:07:11.085 00:07:11.085 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:11.085 ------------------------------------------------------------------------------------ 00:07:11.085 0,0 228064/s 890 MiB/s 0 0 00:07:11.085 ==================================================================================== 00:07:11.085 Total 228064/s 890 MiB/s 0 0' 00:07:11.085 16:15:39 -- accel/accel.sh@20 -- # IFS=: 00:07:11.085 16:15:39 -- accel/accel.sh@20 -- # read -r var val 00:07:11.085 16:15:39 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:07:11.085 16:15:39 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:07:11.085 16:15:39 -- accel/accel.sh@12 -- # build_accel_config 00:07:11.085 16:15:39 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:11.085 16:15:39 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:11.085 16:15:39 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:11.085 16:15:39 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:11.085 16:15:39 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:11.085 16:15:39 -- accel/accel.sh@41 -- # local IFS=, 00:07:11.085 16:15:39 -- accel/accel.sh@42 -- # jq -r . 00:07:11.085 [2024-07-20 16:15:39.570029] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization...
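dif_generate_copy fuses a buffer copy with DIF generation, producing a protected copy in one accel operation; note that the configuration above no longer prints the separate block-size/metadata-size breakdown shown by the plain DIF workloads. A sketch under the same assumptions:

  # copy 4 KiB buffers while generating DIF metadata in the same operation
  ./build/examples/accel_perf -t 1 -w dif_generate_copy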
00:07:11.085 [2024-07-20 16:15:39.570120] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2264880 ] 00:07:11.085 EAL: No free 2048 kB hugepages reported on node 1 00:07:11.085 [2024-07-20 16:15:39.638369] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:11.085 [2024-07-20 16:15:39.673021] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:11.085 16:15:39 -- accel/accel.sh@21 -- # val= 00:07:11.085 16:15:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.085 16:15:39 -- accel/accel.sh@20 -- # IFS=: 00:07:11.085 16:15:39 -- accel/accel.sh@20 -- # read -r var val 00:07:11.085 16:15:39 -- accel/accel.sh@21 -- # val= 00:07:11.085 16:15:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.085 16:15:39 -- accel/accel.sh@20 -- # IFS=: 00:07:11.085 16:15:39 -- accel/accel.sh@20 -- # read -r var val 00:07:11.085 16:15:39 -- accel/accel.sh@21 -- # val=0x1 00:07:11.085 16:15:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.085 16:15:39 -- accel/accel.sh@20 -- # IFS=: 00:07:11.085 16:15:39 -- accel/accel.sh@20 -- # read -r var val 00:07:11.085 16:15:39 -- accel/accel.sh@21 -- # val= 00:07:11.085 16:15:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.085 16:15:39 -- accel/accel.sh@20 -- # IFS=: 00:07:11.085 16:15:39 -- accel/accel.sh@20 -- # read -r var val 00:07:11.086 16:15:39 -- accel/accel.sh@21 -- # val= 00:07:11.086 16:15:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.086 16:15:39 -- accel/accel.sh@20 -- # IFS=: 00:07:11.086 16:15:39 -- accel/accel.sh@20 -- # read -r var val 00:07:11.086 16:15:39 -- accel/accel.sh@21 -- # val=dif_generate_copy 00:07:11.086 16:15:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.086 16:15:39 -- accel/accel.sh@24 -- # accel_opc=dif_generate_copy 00:07:11.086 16:15:39 -- accel/accel.sh@20 -- # IFS=: 00:07:11.086 16:15:39 -- accel/accel.sh@20 -- # read -r var val 00:07:11.086 16:15:39 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:11.086 16:15:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.086 16:15:39 -- accel/accel.sh@20 -- # IFS=: 00:07:11.086 16:15:39 -- accel/accel.sh@20 -- # read -r var val 00:07:11.086 16:15:39 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:11.086 16:15:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.086 16:15:39 -- accel/accel.sh@20 -- # IFS=: 00:07:11.086 16:15:39 -- accel/accel.sh@20 -- # read -r var val 00:07:11.086 16:15:39 -- accel/accel.sh@21 -- # val= 00:07:11.086 16:15:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.086 16:15:39 -- accel/accel.sh@20 -- # IFS=: 00:07:11.086 16:15:39 -- accel/accel.sh@20 -- # read -r var val 00:07:11.086 16:15:39 -- accel/accel.sh@21 -- # val=software 00:07:11.086 16:15:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.086 16:15:39 -- accel/accel.sh@23 -- # accel_module=software 00:07:11.086 16:15:39 -- accel/accel.sh@20 -- # IFS=: 00:07:11.086 16:15:39 -- accel/accel.sh@20 -- # read -r var val 00:07:11.086 16:15:39 -- accel/accel.sh@21 -- # val=32 00:07:11.086 16:15:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.086 16:15:39 -- accel/accel.sh@20 -- # IFS=: 00:07:11.086 16:15:39 -- accel/accel.sh@20 -- # read -r var val 00:07:11.086 16:15:39 -- accel/accel.sh@21 -- # val=32 00:07:11.086 16:15:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.086 16:15:39 -- accel/accel.sh@20 -- # IFS=: 00:07:11.086 16:15:39 -- accel/accel.sh@20 -- # read -r 
var val 00:07:11.086 16:15:39 -- accel/accel.sh@21 -- # val=1 00:07:11.086 16:15:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.086 16:15:39 -- accel/accel.sh@20 -- # IFS=: 00:07:11.086 16:15:39 -- accel/accel.sh@20 -- # read -r var val 00:07:11.086 16:15:39 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:11.086 16:15:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.086 16:15:39 -- accel/accel.sh@20 -- # IFS=: 00:07:11.086 16:15:39 -- accel/accel.sh@20 -- # read -r var val 00:07:11.086 16:15:39 -- accel/accel.sh@21 -- # val=No 00:07:11.086 16:15:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.086 16:15:39 -- accel/accel.sh@20 -- # IFS=: 00:07:11.086 16:15:39 -- accel/accel.sh@20 -- # read -r var val 00:07:11.086 16:15:39 -- accel/accel.sh@21 -- # val= 00:07:11.086 16:15:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.086 16:15:39 -- accel/accel.sh@20 -- # IFS=: 00:07:11.086 16:15:39 -- accel/accel.sh@20 -- # read -r var val 00:07:11.086 16:15:39 -- accel/accel.sh@21 -- # val= 00:07:11.086 16:15:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.086 16:15:39 -- accel/accel.sh@20 -- # IFS=: 00:07:11.086 16:15:39 -- accel/accel.sh@20 -- # read -r var val 00:07:12.480 16:15:40 -- accel/accel.sh@21 -- # val= 00:07:12.480 16:15:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.480 16:15:40 -- accel/accel.sh@20 -- # IFS=: 00:07:12.480 16:15:40 -- accel/accel.sh@20 -- # read -r var val 00:07:12.480 16:15:40 -- accel/accel.sh@21 -- # val= 00:07:12.480 16:15:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.480 16:15:40 -- accel/accel.sh@20 -- # IFS=: 00:07:12.480 16:15:40 -- accel/accel.sh@20 -- # read -r var val 00:07:12.480 16:15:40 -- accel/accel.sh@21 -- # val= 00:07:12.480 16:15:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.480 16:15:40 -- accel/accel.sh@20 -- # IFS=: 00:07:12.480 16:15:40 -- accel/accel.sh@20 -- # read -r var val 00:07:12.480 16:15:40 -- accel/accel.sh@21 -- # val= 00:07:12.480 16:15:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.480 16:15:40 -- accel/accel.sh@20 -- # IFS=: 00:07:12.480 16:15:40 -- accel/accel.sh@20 -- # read -r var val 00:07:12.480 16:15:40 -- accel/accel.sh@21 -- # val= 00:07:12.480 16:15:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.480 16:15:40 -- accel/accel.sh@20 -- # IFS=: 00:07:12.480 16:15:40 -- accel/accel.sh@20 -- # read -r var val 00:07:12.480 16:15:40 -- accel/accel.sh@21 -- # val= 00:07:12.480 16:15:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.480 16:15:40 -- accel/accel.sh@20 -- # IFS=: 00:07:12.480 16:15:40 -- accel/accel.sh@20 -- # read -r var val 00:07:12.480 16:15:40 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:12.480 16:15:40 -- accel/accel.sh@28 -- # [[ -n dif_generate_copy ]] 00:07:12.480 16:15:40 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:12.480 00:07:12.480 real 0m2.577s 00:07:12.480 user 0m2.337s 00:07:12.480 sys 0m0.251s 00:07:12.480 16:15:40 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:12.480 16:15:40 -- common/autotest_common.sh@10 -- # set +x 00:07:12.480 ************************************ 00:07:12.480 END TEST accel_dif_generate_copy 00:07:12.480 ************************************ 00:07:12.480 16:15:40 -- accel/accel.sh@107 -- # [[ y == y ]] 00:07:12.480 16:15:40 -- accel/accel.sh@108 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:12.480 16:15:40 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']' 00:07:12.480 16:15:40 -- 
common/autotest_common.sh@1083 -- # xtrace_disable 00:07:12.480 16:15:40 -- common/autotest_common.sh@10 -- # set +x 00:07:12.480 ************************************ 00:07:12.480 START TEST accel_comp 00:07:12.480 ************************************ 00:07:12.480 16:15:40 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:12.480 16:15:40 -- accel/accel.sh@16 -- # local accel_opc 00:07:12.480 16:15:40 -- accel/accel.sh@17 -- # local accel_module 00:07:12.480 16:15:40 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:12.480 16:15:40 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:12.480 16:15:40 -- accel/accel.sh@12 -- # build_accel_config 00:07:12.480 16:15:40 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:12.480 16:15:40 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:12.480 16:15:40 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:12.480 16:15:40 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:12.480 16:15:40 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:12.480 16:15:40 -- accel/accel.sh@41 -- # local IFS=, 00:07:12.480 16:15:40 -- accel/accel.sh@42 -- # jq -r . 00:07:12.480 [2024-07-20 16:15:40.910693] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:07:12.480 [2024-07-20 16:15:40.910784] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2265092 ] 00:07:12.480 EAL: No free 2048 kB hugepages reported on node 1 00:07:12.480 [2024-07-20 16:15:40.979372] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:12.480 [2024-07-20 16:15:41.014859] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:13.417 16:15:42 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:13.417 00:07:13.417 SPDK Configuration: 00:07:13.417 Core mask: 0x1 00:07:13.417 00:07:13.417 Accel Perf Configuration: 00:07:13.417 Workload Type: compress 00:07:13.417 Transfer size: 4096 bytes 00:07:13.417 Vector count 1 00:07:13.417 Module: software 00:07:13.417 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:13.417 Queue depth: 32 00:07:13.417 Allocate depth: 32 00:07:13.417 # threads/core: 1 00:07:13.417 Run time: 1 seconds 00:07:13.417 Verify: No 00:07:13.417 00:07:13.417 Running for 1 seconds... 
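The compress pass hands accel_perf a reference input with -l; the configuration block above records the file name and shows verification disabled for this workload. A minimal sketch under the same assumptions ($SPDK_DIR as above):

    # Compress the bib test vector repeatedly for 1 second (software module)
    $SPDK_DIR/build/examples/accel_perf -t 1 -w compress -l $SPDK_DIR/test/accel/bib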
00:07:13.417 00:07:13.417 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:13.417 ------------------------------------------------------------------------------------ 00:07:13.417 0,0 68064/s 283 MiB/s 0 0 00:07:13.417 ==================================================================================== 00:07:13.417 Total 68064/s 265 MiB/s 0 0' 00:07:13.417 16:15:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.417 16:15:42 -- accel/accel.sh@20 -- # read -r var val 00:07:13.417 16:15:42 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:13.417 16:15:42 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:13.417 16:15:42 -- accel/accel.sh@12 -- # build_accel_config 00:07:13.417 16:15:42 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:13.417 16:15:42 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:13.417 16:15:42 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:13.417 16:15:42 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:13.417 16:15:42 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:13.417 16:15:42 -- accel/accel.sh@41 -- # local IFS=, 00:07:13.417 16:15:42 -- accel/accel.sh@42 -- # jq -r . 00:07:13.417 [2024-07-20 16:15:42.200303] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:07:13.417 [2024-07-20 16:15:42.200393] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2265238 ] 00:07:13.676 EAL: No free 2048 kB hugepages reported on node 1 00:07:13.676 [2024-07-20 16:15:42.268723] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:13.676 [2024-07-20 16:15:42.303834] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:13.676 16:15:42 -- accel/accel.sh@21 -- # val= 00:07:13.676 16:15:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.676 16:15:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.676 16:15:42 -- accel/accel.sh@20 -- # read -r var val 00:07:13.676 16:15:42 -- accel/accel.sh@21 -- # val= 00:07:13.676 16:15:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.676 16:15:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.676 16:15:42 -- accel/accel.sh@20 -- # read -r var val 00:07:13.676 16:15:42 -- accel/accel.sh@21 -- # val= 00:07:13.676 16:15:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.676 16:15:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.676 16:15:42 -- accel/accel.sh@20 -- # read -r var val 00:07:13.676 16:15:42 -- accel/accel.sh@21 -- # val=0x1 00:07:13.676 16:15:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.676 16:15:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.676 16:15:42 -- accel/accel.sh@20 -- # read -r var val 00:07:13.676 16:15:42 -- accel/accel.sh@21 -- # val= 00:07:13.676 16:15:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.676 16:15:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.676 16:15:42 -- accel/accel.sh@20 -- # read -r var val 00:07:13.676 16:15:42 -- accel/accel.sh@21 -- # val= 00:07:13.676 16:15:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.676 16:15:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.676 16:15:42 -- accel/accel.sh@20 -- # read -r var val 00:07:13.676 16:15:42 -- accel/accel.sh@21 -- # val=compress 00:07:13.676 16:15:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.676 
16:15:42 -- accel/accel.sh@24 -- # accel_opc=compress 00:07:13.676 16:15:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.676 16:15:42 -- accel/accel.sh@20 -- # read -r var val 00:07:13.676 16:15:42 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:13.676 16:15:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.676 16:15:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.676 16:15:42 -- accel/accel.sh@20 -- # read -r var val 00:07:13.676 16:15:42 -- accel/accel.sh@21 -- # val= 00:07:13.676 16:15:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.676 16:15:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.676 16:15:42 -- accel/accel.sh@20 -- # read -r var val 00:07:13.676 16:15:42 -- accel/accel.sh@21 -- # val=software 00:07:13.676 16:15:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.676 16:15:42 -- accel/accel.sh@23 -- # accel_module=software 00:07:13.676 16:15:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.676 16:15:42 -- accel/accel.sh@20 -- # read -r var val 00:07:13.676 16:15:42 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:13.676 16:15:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.676 16:15:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.676 16:15:42 -- accel/accel.sh@20 -- # read -r var val 00:07:13.676 16:15:42 -- accel/accel.sh@21 -- # val=32 00:07:13.676 16:15:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.676 16:15:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.676 16:15:42 -- accel/accel.sh@20 -- # read -r var val 00:07:13.676 16:15:42 -- accel/accel.sh@21 -- # val=32 00:07:13.676 16:15:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.676 16:15:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.676 16:15:42 -- accel/accel.sh@20 -- # read -r var val 00:07:13.676 16:15:42 -- accel/accel.sh@21 -- # val=1 00:07:13.676 16:15:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.676 16:15:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.676 16:15:42 -- accel/accel.sh@20 -- # read -r var val 00:07:13.676 16:15:42 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:13.676 16:15:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.676 16:15:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.676 16:15:42 -- accel/accel.sh@20 -- # read -r var val 00:07:13.676 16:15:42 -- accel/accel.sh@21 -- # val=No 00:07:13.676 16:15:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.676 16:15:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.676 16:15:42 -- accel/accel.sh@20 -- # read -r var val 00:07:13.677 16:15:42 -- accel/accel.sh@21 -- # val= 00:07:13.677 16:15:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.677 16:15:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.677 16:15:42 -- accel/accel.sh@20 -- # read -r var val 00:07:13.677 16:15:42 -- accel/accel.sh@21 -- # val= 00:07:13.677 16:15:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.677 16:15:42 -- accel/accel.sh@20 -- # IFS=: 00:07:13.677 16:15:42 -- accel/accel.sh@20 -- # read -r var val 00:07:15.052 16:15:43 -- accel/accel.sh@21 -- # val= 00:07:15.052 16:15:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.052 16:15:43 -- accel/accel.sh@20 -- # IFS=: 00:07:15.052 16:15:43 -- accel/accel.sh@20 -- # read -r var val 00:07:15.052 16:15:43 -- accel/accel.sh@21 -- # val= 00:07:15.052 16:15:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.052 16:15:43 -- accel/accel.sh@20 -- # IFS=: 00:07:15.052 16:15:43 -- accel/accel.sh@20 -- # read -r var val 00:07:15.053 16:15:43 -- accel/accel.sh@21 -- # val= 00:07:15.053 16:15:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.053 16:15:43 -- accel/accel.sh@20 -- # 
IFS=: 00:07:15.053 16:15:43 -- accel/accel.sh@20 -- # read -r var val 00:07:15.053 16:15:43 -- accel/accel.sh@21 -- # val= 00:07:15.053 16:15:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.053 16:15:43 -- accel/accel.sh@20 -- # IFS=: 00:07:15.053 16:15:43 -- accel/accel.sh@20 -- # read -r var val 00:07:15.053 16:15:43 -- accel/accel.sh@21 -- # val= 00:07:15.053 16:15:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.053 16:15:43 -- accel/accel.sh@20 -- # IFS=: 00:07:15.053 16:15:43 -- accel/accel.sh@20 -- # read -r var val 00:07:15.053 16:15:43 -- accel/accel.sh@21 -- # val= 00:07:15.053 16:15:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.053 16:15:43 -- accel/accel.sh@20 -- # IFS=: 00:07:15.053 16:15:43 -- accel/accel.sh@20 -- # read -r var val 00:07:15.053 16:15:43 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:15.053 16:15:43 -- accel/accel.sh@28 -- # [[ -n compress ]] 00:07:15.053 16:15:43 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:15.053 00:07:15.053 real 0m2.583s 00:07:15.053 user 0m2.337s 00:07:15.053 sys 0m0.257s 00:07:15.053 16:15:43 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:15.053 16:15:43 -- common/autotest_common.sh@10 -- # set +x 00:07:15.053 ************************************ 00:07:15.053 END TEST accel_comp 00:07:15.053 ************************************ 00:07:15.053 16:15:43 -- accel/accel.sh@109 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:15.053 16:15:43 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:07:15.053 16:15:43 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:15.053 16:15:43 -- common/autotest_common.sh@10 -- # set +x 00:07:15.053 ************************************ 00:07:15.053 START TEST accel_decomp 00:07:15.053 ************************************ 00:07:15.053 16:15:43 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:15.053 16:15:43 -- accel/accel.sh@16 -- # local accel_opc 00:07:15.053 16:15:43 -- accel/accel.sh@17 -- # local accel_module 00:07:15.053 16:15:43 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:15.053 16:15:43 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:15.053 16:15:43 -- accel/accel.sh@12 -- # build_accel_config 00:07:15.053 16:15:43 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:15.053 16:15:43 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:15.053 16:15:43 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:15.053 16:15:43 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:15.053 16:15:43 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:15.053 16:15:43 -- accel/accel.sh@41 -- # local IFS=, 00:07:15.053 16:15:43 -- accel/accel.sh@42 -- # jq -r . 00:07:15.053 [2024-07-20 16:15:43.541683] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
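accel_decomp adds -y to the flag set; judging from the configuration blocks in this log, that is the switch that flips Verify from No to Yes relative to the compress pass. Hedged sketch:

    # Decompress the bib test vector for 1 second and verify the output
    $SPDK_DIR/build/examples/accel_perf -t 1 -w decompress -l $SPDK_DIR/test/accel/bib -y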
00:07:15.053 [2024-07-20 16:15:43.541779] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2265488 ] 00:07:15.053 EAL: No free 2048 kB hugepages reported on node 1 00:07:15.053 [2024-07-20 16:15:43.609916] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:15.053 [2024-07-20 16:15:43.645798] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:16.428 16:15:44 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:16.428 00:07:16.428 SPDK Configuration: 00:07:16.428 Core mask: 0x1 00:07:16.428 00:07:16.428 Accel Perf Configuration: 00:07:16.428 Workload Type: decompress 00:07:16.428 Transfer size: 4096 bytes 00:07:16.428 Vector count 1 00:07:16.428 Module: software 00:07:16.428 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:16.428 Queue depth: 32 00:07:16.428 Allocate depth: 32 00:07:16.428 # threads/core: 1 00:07:16.428 Run time: 1 seconds 00:07:16.428 Verify: Yes 00:07:16.428 00:07:16.428 Running for 1 seconds... 00:07:16.428 00:07:16.428 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:16.428 ------------------------------------------------------------------------------------ 00:07:16.428 0,0 94080/s 173 MiB/s 0 0 00:07:16.428 ==================================================================================== 00:07:16.428 Total 94080/s 367 MiB/s 0 0' 00:07:16.428 16:15:44 -- accel/accel.sh@20 -- # IFS=: 00:07:16.428 16:15:44 -- accel/accel.sh@20 -- # read -r var val 00:07:16.428 16:15:44 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:16.428 16:15:44 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:16.428 16:15:44 -- accel/accel.sh@12 -- # build_accel_config 00:07:16.428 16:15:44 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:16.428 16:15:44 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:16.428 16:15:44 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:16.428 16:15:44 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:16.428 16:15:44 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:16.428 16:15:44 -- accel/accel.sh@41 -- # local IFS=, 00:07:16.428 16:15:44 -- accel/accel.sh@42 -- # jq -r . 00:07:16.428 [2024-07-20 16:15:44.830202] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
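The decompress totals reconcile as well: 94080 transfers/s * 4096 B is roughly 367.5 MiB/s, matching the Total row. The same arithmetic in shell form:

    echo $((94080 * 4096 / 1048576))   # => 367 (MiB/s, truncated)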
00:07:16.428 [2024-07-20 16:15:44.830293] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2265760 ] 00:07:16.428 EAL: No free 2048 kB hugepages reported on node 1 00:07:16.428 [2024-07-20 16:15:44.899074] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:16.428 [2024-07-20 16:15:44.933560] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:16.428 16:15:44 -- accel/accel.sh@21 -- # val= 00:07:16.428 16:15:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.428 16:15:44 -- accel/accel.sh@20 -- # IFS=: 00:07:16.428 16:15:44 -- accel/accel.sh@20 -- # read -r var val 00:07:16.428 16:15:44 -- accel/accel.sh@21 -- # val= 00:07:16.428 16:15:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.428 16:15:44 -- accel/accel.sh@20 -- # IFS=: 00:07:16.428 16:15:44 -- accel/accel.sh@20 -- # read -r var val 00:07:16.428 16:15:44 -- accel/accel.sh@21 -- # val= 00:07:16.428 16:15:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.428 16:15:44 -- accel/accel.sh@20 -- # IFS=: 00:07:16.428 16:15:44 -- accel/accel.sh@20 -- # read -r var val 00:07:16.428 16:15:44 -- accel/accel.sh@21 -- # val=0x1 00:07:16.428 16:15:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.428 16:15:44 -- accel/accel.sh@20 -- # IFS=: 00:07:16.428 16:15:44 -- accel/accel.sh@20 -- # read -r var val 00:07:16.428 16:15:44 -- accel/accel.sh@21 -- # val= 00:07:16.428 16:15:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.428 16:15:44 -- accel/accel.sh@20 -- # IFS=: 00:07:16.428 16:15:44 -- accel/accel.sh@20 -- # read -r var val 00:07:16.428 16:15:44 -- accel/accel.sh@21 -- # val= 00:07:16.428 16:15:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.428 16:15:44 -- accel/accel.sh@20 -- # IFS=: 00:07:16.428 16:15:44 -- accel/accel.sh@20 -- # read -r var val 00:07:16.428 16:15:44 -- accel/accel.sh@21 -- # val=decompress 00:07:16.428 16:15:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.428 16:15:44 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:16.428 16:15:44 -- accel/accel.sh@20 -- # IFS=: 00:07:16.428 16:15:44 -- accel/accel.sh@20 -- # read -r var val 00:07:16.428 16:15:44 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:16.428 16:15:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.428 16:15:44 -- accel/accel.sh@20 -- # IFS=: 00:07:16.428 16:15:44 -- accel/accel.sh@20 -- # read -r var val 00:07:16.428 16:15:44 -- accel/accel.sh@21 -- # val= 00:07:16.428 16:15:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.428 16:15:44 -- accel/accel.sh@20 -- # IFS=: 00:07:16.428 16:15:44 -- accel/accel.sh@20 -- # read -r var val 00:07:16.428 16:15:44 -- accel/accel.sh@21 -- # val=software 00:07:16.428 16:15:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.428 16:15:44 -- accel/accel.sh@23 -- # accel_module=software 00:07:16.428 16:15:44 -- accel/accel.sh@20 -- # IFS=: 00:07:16.428 16:15:44 -- accel/accel.sh@20 -- # read -r var val 00:07:16.428 16:15:44 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:16.428 16:15:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.428 16:15:44 -- accel/accel.sh@20 -- # IFS=: 00:07:16.428 16:15:44 -- accel/accel.sh@20 -- # read -r var val 00:07:16.428 16:15:44 -- accel/accel.sh@21 -- # val=32 00:07:16.428 16:15:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.428 16:15:44 -- accel/accel.sh@20 -- # IFS=: 00:07:16.428 
16:15:44 -- accel/accel.sh@20 -- # read -r var val 00:07:16.428 16:15:44 -- accel/accel.sh@21 -- # val=32 00:07:16.428 16:15:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.428 16:15:44 -- accel/accel.sh@20 -- # IFS=: 00:07:16.428 16:15:44 -- accel/accel.sh@20 -- # read -r var val 00:07:16.428 16:15:44 -- accel/accel.sh@21 -- # val=1 00:07:16.428 16:15:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.428 16:15:44 -- accel/accel.sh@20 -- # IFS=: 00:07:16.428 16:15:44 -- accel/accel.sh@20 -- # read -r var val 00:07:16.428 16:15:44 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:16.428 16:15:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.428 16:15:44 -- accel/accel.sh@20 -- # IFS=: 00:07:16.428 16:15:44 -- accel/accel.sh@20 -- # read -r var val 00:07:16.428 16:15:44 -- accel/accel.sh@21 -- # val=Yes 00:07:16.428 16:15:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.428 16:15:44 -- accel/accel.sh@20 -- # IFS=: 00:07:16.428 16:15:44 -- accel/accel.sh@20 -- # read -r var val 00:07:16.428 16:15:44 -- accel/accel.sh@21 -- # val= 00:07:16.428 16:15:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.428 16:15:44 -- accel/accel.sh@20 -- # IFS=: 00:07:16.428 16:15:44 -- accel/accel.sh@20 -- # read -r var val 00:07:16.428 16:15:44 -- accel/accel.sh@21 -- # val= 00:07:16.428 16:15:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.428 16:15:44 -- accel/accel.sh@20 -- # IFS=: 00:07:16.428 16:15:44 -- accel/accel.sh@20 -- # read -r var val 00:07:17.366 16:15:46 -- accel/accel.sh@21 -- # val= 00:07:17.366 16:15:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.366 16:15:46 -- accel/accel.sh@20 -- # IFS=: 00:07:17.366 16:15:46 -- accel/accel.sh@20 -- # read -r var val 00:07:17.366 16:15:46 -- accel/accel.sh@21 -- # val= 00:07:17.366 16:15:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.366 16:15:46 -- accel/accel.sh@20 -- # IFS=: 00:07:17.366 16:15:46 -- accel/accel.sh@20 -- # read -r var val 00:07:17.366 16:15:46 -- accel/accel.sh@21 -- # val= 00:07:17.366 16:15:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.366 16:15:46 -- accel/accel.sh@20 -- # IFS=: 00:07:17.366 16:15:46 -- accel/accel.sh@20 -- # read -r var val 00:07:17.366 16:15:46 -- accel/accel.sh@21 -- # val= 00:07:17.366 16:15:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.366 16:15:46 -- accel/accel.sh@20 -- # IFS=: 00:07:17.366 16:15:46 -- accel/accel.sh@20 -- # read -r var val 00:07:17.366 16:15:46 -- accel/accel.sh@21 -- # val= 00:07:17.366 16:15:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.366 16:15:46 -- accel/accel.sh@20 -- # IFS=: 00:07:17.366 16:15:46 -- accel/accel.sh@20 -- # read -r var val 00:07:17.366 16:15:46 -- accel/accel.sh@21 -- # val= 00:07:17.366 16:15:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.366 16:15:46 -- accel/accel.sh@20 -- # IFS=: 00:07:17.366 16:15:46 -- accel/accel.sh@20 -- # read -r var val 00:07:17.366 16:15:46 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:17.366 16:15:46 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:17.366 16:15:46 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:17.366 00:07:17.366 real 0m2.580s 00:07:17.366 user 0m2.331s 00:07:17.366 sys 0m0.259s 00:07:17.366 16:15:46 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:17.366 16:15:46 -- common/autotest_common.sh@10 -- # set +x 00:07:17.366 ************************************ 00:07:17.366 END TEST accel_decomp 00:07:17.366 ************************************ 00:07:17.366 16:15:46 -- accel/accel.sh@110 -- # run_test accel_decmop_full accel_test -t 1 
-w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:17.366 16:15:46 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:07:17.366 16:15:46 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:17.366 16:15:46 -- common/autotest_common.sh@10 -- # set +x 00:07:17.366 ************************************ 00:07:17.366 START TEST accel_decmop_full 00:07:17.366 ************************************ 00:07:17.366 16:15:46 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:17.366 16:15:46 -- accel/accel.sh@16 -- # local accel_opc 00:07:17.366 16:15:46 -- accel/accel.sh@17 -- # local accel_module 00:07:17.366 16:15:46 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:17.366 16:15:46 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:17.366 16:15:46 -- accel/accel.sh@12 -- # build_accel_config 00:07:17.366 16:15:46 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:17.366 16:15:46 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:17.366 16:15:46 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:17.366 16:15:46 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:17.366 16:15:46 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:17.366 16:15:46 -- accel/accel.sh@41 -- # local IFS=, 00:07:17.366 16:15:46 -- accel/accel.sh@42 -- # jq -r . 00:07:17.625 [2024-07-20 16:15:46.172804] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:07:17.625 [2024-07-20 16:15:46.172899] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2266043 ] 00:07:17.625 EAL: No free 2048 kB hugepages reported on node 1 00:07:17.625 [2024-07-20 16:15:46.242102] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:17.625 [2024-07-20 16:15:46.277726] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:19.005 16:15:47 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:19.005 00:07:19.005 SPDK Configuration: 00:07:19.005 Core mask: 0x1 00:07:19.005 00:07:19.005 Accel Perf Configuration: 00:07:19.005 Workload Type: decompress 00:07:19.005 Transfer size: 111250 bytes 00:07:19.005 Vector count 1 00:07:19.005 Module: software 00:07:19.005 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:19.005 Queue depth: 32 00:07:19.005 Allocate depth: 32 00:07:19.005 # threads/core: 1 00:07:19.005 Run time: 1 seconds 00:07:19.005 Verify: Yes 00:07:19.005 00:07:19.005 Running for 1 seconds... 
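This "full" variant (logged as accel_decmop_full, a transposition of decomp that comes from the harness script itself) appends -o 0 to the decompress flags; the configuration block above shows the effect, a transfer size of 111250 bytes, apparently the whole bib test vector rather than the default 4096-byte slice. Sketch under that reading:

    # Decompress the entire input per operation (transfer size = file size)
    $SPDK_DIR/build/examples/accel_perf -t 1 -w decompress -l $SPDK_DIR/test/accel/bib -y -o 0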
00:07:19.005 00:07:19.005 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:19.005 ------------------------------------------------------------------------------------ 00:07:19.005 0,0 5888/s 243 MiB/s 0 0 00:07:19.005 ==================================================================================== 00:07:19.005 Total 5888/s 624 MiB/s 0 0' 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # IFS=: 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # read -r var val 00:07:19.005 16:15:47 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:19.005 16:15:47 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:19.005 16:15:47 -- accel/accel.sh@12 -- # build_accel_config 00:07:19.005 16:15:47 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:19.005 16:15:47 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:19.005 16:15:47 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:19.005 16:15:47 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:19.005 16:15:47 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:19.005 16:15:47 -- accel/accel.sh@41 -- # local IFS=, 00:07:19.005 16:15:47 -- accel/accel.sh@42 -- # jq -r . 00:07:19.005 [2024-07-20 16:15:47.470264] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:07:19.005 [2024-07-20 16:15:47.470356] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2266309 ] 00:07:19.005 EAL: No free 2048 kB hugepages reported on node 1 00:07:19.005 [2024-07-20 16:15:47.538926] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:19.005 [2024-07-20 16:15:47.573347] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:19.005 16:15:47 -- accel/accel.sh@21 -- # val= 00:07:19.005 16:15:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # IFS=: 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # read -r var val 00:07:19.005 16:15:47 -- accel/accel.sh@21 -- # val= 00:07:19.005 16:15:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # IFS=: 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # read -r var val 00:07:19.005 16:15:47 -- accel/accel.sh@21 -- # val= 00:07:19.005 16:15:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # IFS=: 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # read -r var val 00:07:19.005 16:15:47 -- accel/accel.sh@21 -- # val=0x1 00:07:19.005 16:15:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # IFS=: 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # read -r var val 00:07:19.005 16:15:47 -- accel/accel.sh@21 -- # val= 00:07:19.005 16:15:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # IFS=: 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # read -r var val 00:07:19.005 16:15:47 -- accel/accel.sh@21 -- # val= 00:07:19.005 16:15:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # IFS=: 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # read -r var val 00:07:19.005 16:15:47 -- accel/accel.sh@21 -- # val=decompress 00:07:19.005 16:15:47 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:19.005 16:15:47 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # IFS=: 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # read -r var val 00:07:19.005 16:15:47 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:19.005 16:15:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # IFS=: 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # read -r var val 00:07:19.005 16:15:47 -- accel/accel.sh@21 -- # val= 00:07:19.005 16:15:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # IFS=: 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # read -r var val 00:07:19.005 16:15:47 -- accel/accel.sh@21 -- # val=software 00:07:19.005 16:15:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.005 16:15:47 -- accel/accel.sh@23 -- # accel_module=software 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # IFS=: 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # read -r var val 00:07:19.005 16:15:47 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:19.005 16:15:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # IFS=: 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # read -r var val 00:07:19.005 16:15:47 -- accel/accel.sh@21 -- # val=32 00:07:19.005 16:15:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # IFS=: 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # read -r var val 00:07:19.005 16:15:47 -- accel/accel.sh@21 -- # val=32 00:07:19.005 16:15:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # IFS=: 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # read -r var val 00:07:19.005 16:15:47 -- accel/accel.sh@21 -- # val=1 00:07:19.005 16:15:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # IFS=: 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # read -r var val 00:07:19.005 16:15:47 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:19.005 16:15:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # IFS=: 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # read -r var val 00:07:19.005 16:15:47 -- accel/accel.sh@21 -- # val=Yes 00:07:19.005 16:15:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # IFS=: 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # read -r var val 00:07:19.005 16:15:47 -- accel/accel.sh@21 -- # val= 00:07:19.005 16:15:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # IFS=: 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # read -r var val 00:07:19.005 16:15:47 -- accel/accel.sh@21 -- # val= 00:07:19.005 16:15:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # IFS=: 00:07:19.005 16:15:47 -- accel/accel.sh@20 -- # read -r var val 00:07:19.944 16:15:48 -- accel/accel.sh@21 -- # val= 00:07:19.944 16:15:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.944 16:15:48 -- accel/accel.sh@20 -- # IFS=: 00:07:19.944 16:15:48 -- accel/accel.sh@20 -- # read -r var val 00:07:19.944 16:15:48 -- accel/accel.sh@21 -- # val= 00:07:19.944 16:15:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.944 16:15:48 -- accel/accel.sh@20 -- # IFS=: 00:07:19.944 16:15:48 -- accel/accel.sh@20 -- # read -r var val 00:07:19.944 16:15:48 -- accel/accel.sh@21 -- # val= 00:07:19.944 16:15:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.944 16:15:48 
-- accel/accel.sh@20 -- # IFS=: 00:07:19.944 16:15:48 -- accel/accel.sh@20 -- # read -r var val 00:07:19.944 16:15:48 -- accel/accel.sh@21 -- # val= 00:07:19.944 16:15:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.944 16:15:48 -- accel/accel.sh@20 -- # IFS=: 00:07:19.944 16:15:48 -- accel/accel.sh@20 -- # read -r var val 00:07:20.203 16:15:48 -- accel/accel.sh@21 -- # val= 00:07:20.203 16:15:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.203 16:15:48 -- accel/accel.sh@20 -- # IFS=: 00:07:20.203 16:15:48 -- accel/accel.sh@20 -- # read -r var val 00:07:20.203 16:15:48 -- accel/accel.sh@21 -- # val= 00:07:20.203 16:15:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.203 16:15:48 -- accel/accel.sh@20 -- # IFS=: 00:07:20.203 16:15:48 -- accel/accel.sh@20 -- # read -r var val 00:07:20.203 16:15:48 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:20.203 16:15:48 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:20.203 16:15:48 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:20.203 00:07:20.203 real 0m2.599s 00:07:20.203 user 0m2.352s 00:07:20.203 sys 0m0.255s 00:07:20.203 16:15:48 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:20.203 16:15:48 -- common/autotest_common.sh@10 -- # set +x 00:07:20.203 ************************************ 00:07:20.203 END TEST accel_decmop_full 00:07:20.203 ************************************ 00:07:20.203 16:15:48 -- accel/accel.sh@111 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:20.203 16:15:48 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:07:20.203 16:15:48 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:20.203 16:15:48 -- common/autotest_common.sh@10 -- # set +x 00:07:20.203 ************************************ 00:07:20.203 START TEST accel_decomp_mcore 00:07:20.203 ************************************ 00:07:20.203 16:15:48 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:20.203 16:15:48 -- accel/accel.sh@16 -- # local accel_opc 00:07:20.203 16:15:48 -- accel/accel.sh@17 -- # local accel_module 00:07:20.203 16:15:48 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:20.203 16:15:48 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:20.203 16:15:48 -- accel/accel.sh@12 -- # build_accel_config 00:07:20.203 16:15:48 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:20.203 16:15:48 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:20.203 16:15:48 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:20.203 16:15:48 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:20.203 16:15:48 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:20.203 16:15:48 -- accel/accel.sh@41 -- # local IFS=, 00:07:20.203 16:15:48 -- accel/accel.sh@42 -- # jq -r . 00:07:20.203 [2024-07-20 16:15:48.819601] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
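accel_decomp_mcore widens the core mask with -m 0xf, so EAL reports four usable cores below and a reactor starts on each of cores 0-3. Sketch with the mask taken from the log:

    # The verified decompress workload fanned out across 4 cores
    $SPDK_DIR/build/examples/accel_perf -t 1 -w decompress -l $SPDK_DIR/test/accel/bib -y -m 0xf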
00:07:20.203 [2024-07-20 16:15:48.819710] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2266592 ] 00:07:20.203 EAL: No free 2048 kB hugepages reported on node 1 00:07:20.203 [2024-07-20 16:15:48.889131] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:20.203 [2024-07-20 16:15:48.927669] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:20.203 [2024-07-20 16:15:48.927765] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:20.203 [2024-07-20 16:15:48.927853] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:20.203 [2024-07-20 16:15:48.927854] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:21.580 16:15:50 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:21.580 00:07:21.580 SPDK Configuration: 00:07:21.580 Core mask: 0xf 00:07:21.580 00:07:21.580 Accel Perf Configuration: 00:07:21.580 Workload Type: decompress 00:07:21.580 Transfer size: 4096 bytes 00:07:21.580 Vector count 1 00:07:21.580 Module: software 00:07:21.580 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:21.580 Queue depth: 32 00:07:21.580 Allocate depth: 32 00:07:21.580 # threads/core: 1 00:07:21.580 Run time: 1 seconds 00:07:21.580 Verify: Yes 00:07:21.580 00:07:21.580 Running for 1 seconds... 00:07:21.580 00:07:21.580 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:21.580 ------------------------------------------------------------------------------------ 00:07:21.580 0,0 74016/s 136 MiB/s 0 0 00:07:21.580 3,0 78560/s 144 MiB/s 0 0 00:07:21.580 2,0 78272/s 144 MiB/s 0 0 00:07:21.580 1,0 78208/s 144 MiB/s 0 0 00:07:21.580 ==================================================================================== 00:07:21.580 Total 309056/s 1207 MiB/s 0 0' 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # IFS=: 00:07:21.580 16:15:50 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # read -r var val 00:07:21.580 16:15:50 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:21.580 16:15:50 -- accel/accel.sh@12 -- # build_accel_config 00:07:21.580 16:15:50 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:21.580 16:15:50 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:21.580 16:15:50 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:21.580 16:15:50 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:21.580 16:15:50 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:21.580 16:15:50 -- accel/accel.sh@41 -- # local IFS=, 00:07:21.580 16:15:50 -- accel/accel.sh@42 -- # jq -r . 00:07:21.580 [2024-07-20 16:15:50.109010] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
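The four per-core rows sum exactly to the Total row: 74016 + 78560 + 78272 + 78208 = 309056 transfers/s, which at 4096 B per transfer is about 1207.3 MiB/s, as printed. Shell check:

    echo $(((74016 + 78560 + 78272 + 78208) * 4096 / 1048576))   # => 1207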
00:07:21.580 [2024-07-20 16:15:50.109076] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2266758 ] 00:07:21.580 EAL: No free 2048 kB hugepages reported on node 1 00:07:21.580 [2024-07-20 16:15:50.172927] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:21.580 [2024-07-20 16:15:50.210463] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:21.580 [2024-07-20 16:15:50.210522] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:21.580 [2024-07-20 16:15:50.210605] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:21.580 [2024-07-20 16:15:50.210606] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:21.580 16:15:50 -- accel/accel.sh@21 -- # val= 00:07:21.580 16:15:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # IFS=: 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # read -r var val 00:07:21.580 16:15:50 -- accel/accel.sh@21 -- # val= 00:07:21.580 16:15:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # IFS=: 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # read -r var val 00:07:21.580 16:15:50 -- accel/accel.sh@21 -- # val= 00:07:21.580 16:15:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # IFS=: 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # read -r var val 00:07:21.580 16:15:50 -- accel/accel.sh@21 -- # val=0xf 00:07:21.580 16:15:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # IFS=: 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # read -r var val 00:07:21.580 16:15:50 -- accel/accel.sh@21 -- # val= 00:07:21.580 16:15:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # IFS=: 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # read -r var val 00:07:21.580 16:15:50 -- accel/accel.sh@21 -- # val= 00:07:21.580 16:15:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # IFS=: 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # read -r var val 00:07:21.580 16:15:50 -- accel/accel.sh@21 -- # val=decompress 00:07:21.580 16:15:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.580 16:15:50 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # IFS=: 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # read -r var val 00:07:21.580 16:15:50 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:21.580 16:15:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # IFS=: 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # read -r var val 00:07:21.580 16:15:50 -- accel/accel.sh@21 -- # val= 00:07:21.580 16:15:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # IFS=: 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # read -r var val 00:07:21.580 16:15:50 -- accel/accel.sh@21 -- # val=software 00:07:21.580 16:15:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.580 16:15:50 -- accel/accel.sh@23 -- # accel_module=software 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # IFS=: 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # read -r var val 00:07:21.580 16:15:50 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:21.580 16:15:50 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # IFS=: 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # read -r var val 00:07:21.580 16:15:50 -- accel/accel.sh@21 -- # val=32 00:07:21.580 16:15:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # IFS=: 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # read -r var val 00:07:21.580 16:15:50 -- accel/accel.sh@21 -- # val=32 00:07:21.580 16:15:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # IFS=: 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # read -r var val 00:07:21.580 16:15:50 -- accel/accel.sh@21 -- # val=1 00:07:21.580 16:15:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # IFS=: 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # read -r var val 00:07:21.580 16:15:50 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:21.580 16:15:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # IFS=: 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # read -r var val 00:07:21.580 16:15:50 -- accel/accel.sh@21 -- # val=Yes 00:07:21.580 16:15:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # IFS=: 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # read -r var val 00:07:21.580 16:15:50 -- accel/accel.sh@21 -- # val= 00:07:21.580 16:15:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # IFS=: 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # read -r var val 00:07:21.580 16:15:50 -- accel/accel.sh@21 -- # val= 00:07:21.580 16:15:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # IFS=: 00:07:21.580 16:15:50 -- accel/accel.sh@20 -- # read -r var val 00:07:22.963 16:15:51 -- accel/accel.sh@21 -- # val= 00:07:22.963 16:15:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.963 16:15:51 -- accel/accel.sh@20 -- # IFS=: 00:07:22.963 16:15:51 -- accel/accel.sh@20 -- # read -r var val 00:07:22.963 16:15:51 -- accel/accel.sh@21 -- # val= 00:07:22.963 16:15:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.963 16:15:51 -- accel/accel.sh@20 -- # IFS=: 00:07:22.963 16:15:51 -- accel/accel.sh@20 -- # read -r var val 00:07:22.963 16:15:51 -- accel/accel.sh@21 -- # val= 00:07:22.963 16:15:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.963 16:15:51 -- accel/accel.sh@20 -- # IFS=: 00:07:22.963 16:15:51 -- accel/accel.sh@20 -- # read -r var val 00:07:22.963 16:15:51 -- accel/accel.sh@21 -- # val= 00:07:22.963 16:15:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.963 16:15:51 -- accel/accel.sh@20 -- # IFS=: 00:07:22.963 16:15:51 -- accel/accel.sh@20 -- # read -r var val 00:07:22.963 16:15:51 -- accel/accel.sh@21 -- # val= 00:07:22.963 16:15:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.963 16:15:51 -- accel/accel.sh@20 -- # IFS=: 00:07:22.963 16:15:51 -- accel/accel.sh@20 -- # read -r var val 00:07:22.963 16:15:51 -- accel/accel.sh@21 -- # val= 00:07:22.963 16:15:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.963 16:15:51 -- accel/accel.sh@20 -- # IFS=: 00:07:22.963 16:15:51 -- accel/accel.sh@20 -- # read -r var val 00:07:22.964 16:15:51 -- accel/accel.sh@21 -- # val= 00:07:22.964 16:15:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.964 16:15:51 -- accel/accel.sh@20 -- # IFS=: 00:07:22.964 16:15:51 -- accel/accel.sh@20 -- # read -r var val 00:07:22.964 16:15:51 -- accel/accel.sh@21 -- # val= 00:07:22.964 16:15:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.964 
16:15:51 -- accel/accel.sh@20 -- # IFS=: 00:07:22.964 16:15:51 -- accel/accel.sh@20 -- # read -r var val 00:07:22.964 16:15:51 -- accel/accel.sh@21 -- # val= 00:07:22.964 16:15:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.964 16:15:51 -- accel/accel.sh@20 -- # IFS=: 00:07:22.964 16:15:51 -- accel/accel.sh@20 -- # read -r var val 00:07:22.964 16:15:51 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:22.964 16:15:51 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:22.964 16:15:51 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:22.964 00:07:22.964 real 0m2.590s 00:07:22.964 user 0m8.984s 00:07:22.964 sys 0m0.269s 00:07:22.964 16:15:51 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:22.964 16:15:51 -- common/autotest_common.sh@10 -- # set +x 00:07:22.964 ************************************ 00:07:22.964 END TEST accel_decomp_mcore 00:07:22.964 ************************************ 00:07:22.964 16:15:51 -- accel/accel.sh@112 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:22.964 16:15:51 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:07:22.964 16:15:51 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:22.964 16:15:51 -- common/autotest_common.sh@10 -- # set +x 00:07:22.964 ************************************ 00:07:22.964 START TEST accel_decomp_full_mcore 00:07:22.964 ************************************ 00:07:22.964 16:15:51 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:22.964 16:15:51 -- accel/accel.sh@16 -- # local accel_opc 00:07:22.964 16:15:51 -- accel/accel.sh@17 -- # local accel_module 00:07:22.964 16:15:51 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:22.964 16:15:51 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:22.964 16:15:51 -- accel/accel.sh@12 -- # build_accel_config 00:07:22.964 16:15:51 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:22.964 16:15:51 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:22.964 16:15:51 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:22.964 16:15:51 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:22.964 16:15:51 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:22.964 16:15:51 -- accel/accel.sh@41 -- # local IFS=, 00:07:22.964 16:15:51 -- accel/accel.sh@42 -- # jq -r . 00:07:22.964 [2024-07-20 16:15:51.459736] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
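accel_decomp_full_mcore combines the two preceding variants: full-file transfers (-o 0) spread across the 0xf core mask. Hedged sketch, same path assumption as above:

    $SPDK_DIR/build/examples/accel_perf -t 1 -w decompress -l $SPDK_DIR/test/accel/bib -y -o 0 -m 0xf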
00:07:22.964 [2024-07-20 16:15:51.459825] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2266938 ] 00:07:22.964 EAL: No free 2048 kB hugepages reported on node 1 00:07:22.964 [2024-07-20 16:15:51.529201] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:22.964 [2024-07-20 16:15:51.567781] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:22.964 [2024-07-20 16:15:51.567875] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:22.964 [2024-07-20 16:15:51.567983] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:22.964 [2024-07-20 16:15:51.567985] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:24.340 16:15:52 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:24.340 00:07:24.340 SPDK Configuration: 00:07:24.340 Core mask: 0xf 00:07:24.340 00:07:24.340 Accel Perf Configuration: 00:07:24.340 Workload Type: decompress 00:07:24.340 Transfer size: 111250 bytes 00:07:24.340 Vector count 1 00:07:24.340 Module: software 00:07:24.340 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:24.340 Queue depth: 32 00:07:24.340 Allocate depth: 32 00:07:24.340 # threads/core: 1 00:07:24.340 Run time: 1 seconds 00:07:24.340 Verify: Yes 00:07:24.340 00:07:24.340 Running for 1 seconds... 00:07:24.340 00:07:24.340 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:24.340 ------------------------------------------------------------------------------------ 00:07:24.340 0,0 5792/s 239 MiB/s 0 0 00:07:24.340 3,0 5824/s 240 MiB/s 0 0 00:07:24.340 2,0 5792/s 239 MiB/s 0 0 00:07:24.340 1,0 5792/s 239 MiB/s 0 0 00:07:24.340 ==================================================================================== 00:07:24.340 Total 23200/s 2461 MiB/s 0 0' 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # IFS=: 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # read -r var val 00:07:24.340 16:15:52 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:24.340 16:15:52 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:24.340 16:15:52 -- accel/accel.sh@12 -- # build_accel_config 00:07:24.340 16:15:52 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:24.340 16:15:52 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:24.340 16:15:52 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:24.340 16:15:52 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:24.340 16:15:52 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:24.340 16:15:52 -- accel/accel.sh@41 -- # local IFS=, 00:07:24.340 16:15:52 -- accel/accel.sh@42 -- # jq -r . 00:07:24.340 [2024-07-20 16:15:52.767967] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
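At the 111250-byte transfer size the multicore totals also reconcile: 5792 + 5824 + 5792 + 5792 = 23200 ops/s, and 23200 * 111250 B is roughly 2461.4 MiB/s, matching the Total row. Shell check:

    echo $(((5792 + 5824 + 5792 + 5792) * 111250 / 1048576))   # => 2461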
00:07:24.340 [2024-07-20 16:15:52.768058] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2267180 ] 00:07:24.340 EAL: No free 2048 kB hugepages reported on node 1 00:07:24.340 [2024-07-20 16:15:52.835513] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:24.340 [2024-07-20 16:15:52.873168] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:24.340 [2024-07-20 16:15:52.873268] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:24.340 [2024-07-20 16:15:52.873355] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:24.340 [2024-07-20 16:15:52.873357] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:24.340 16:15:52 -- accel/accel.sh@21 -- # val= 00:07:24.340 16:15:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # IFS=: 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # read -r var val 00:07:24.340 16:15:52 -- accel/accel.sh@21 -- # val= 00:07:24.340 16:15:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # IFS=: 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # read -r var val 00:07:24.340 16:15:52 -- accel/accel.sh@21 -- # val= 00:07:24.340 16:15:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # IFS=: 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # read -r var val 00:07:24.340 16:15:52 -- accel/accel.sh@21 -- # val=0xf 00:07:24.340 16:15:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # IFS=: 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # read -r var val 00:07:24.340 16:15:52 -- accel/accel.sh@21 -- # val= 00:07:24.340 16:15:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # IFS=: 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # read -r var val 00:07:24.340 16:15:52 -- accel/accel.sh@21 -- # val= 00:07:24.340 16:15:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # IFS=: 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # read -r var val 00:07:24.340 16:15:52 -- accel/accel.sh@21 -- # val=decompress 00:07:24.340 16:15:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.340 16:15:52 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # IFS=: 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # read -r var val 00:07:24.340 16:15:52 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:24.340 16:15:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # IFS=: 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # read -r var val 00:07:24.340 16:15:52 -- accel/accel.sh@21 -- # val= 00:07:24.340 16:15:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # IFS=: 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # read -r var val 00:07:24.340 16:15:52 -- accel/accel.sh@21 -- # val=software 00:07:24.340 16:15:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.340 16:15:52 -- accel/accel.sh@23 -- # accel_module=software 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # IFS=: 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # read -r var val 00:07:24.340 16:15:52 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:24.340 16:15:52 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # IFS=: 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # read -r var val 00:07:24.340 16:15:52 -- accel/accel.sh@21 -- # val=32 00:07:24.340 16:15:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # IFS=: 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # read -r var val 00:07:24.340 16:15:52 -- accel/accel.sh@21 -- # val=32 00:07:24.340 16:15:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # IFS=: 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # read -r var val 00:07:24.340 16:15:52 -- accel/accel.sh@21 -- # val=1 00:07:24.340 16:15:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # IFS=: 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # read -r var val 00:07:24.340 16:15:52 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:24.340 16:15:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # IFS=: 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # read -r var val 00:07:24.340 16:15:52 -- accel/accel.sh@21 -- # val=Yes 00:07:24.340 16:15:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # IFS=: 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # read -r var val 00:07:24.340 16:15:52 -- accel/accel.sh@21 -- # val= 00:07:24.340 16:15:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # IFS=: 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # read -r var val 00:07:24.340 16:15:52 -- accel/accel.sh@21 -- # val= 00:07:24.340 16:15:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # IFS=: 00:07:24.340 16:15:52 -- accel/accel.sh@20 -- # read -r var val 00:07:25.272 16:15:54 -- accel/accel.sh@21 -- # val= 00:07:25.272 16:15:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.272 16:15:54 -- accel/accel.sh@20 -- # IFS=: 00:07:25.272 16:15:54 -- accel/accel.sh@20 -- # read -r var val 00:07:25.272 16:15:54 -- accel/accel.sh@21 -- # val= 00:07:25.272 16:15:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.272 16:15:54 -- accel/accel.sh@20 -- # IFS=: 00:07:25.272 16:15:54 -- accel/accel.sh@20 -- # read -r var val 00:07:25.272 16:15:54 -- accel/accel.sh@21 -- # val= 00:07:25.272 16:15:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.272 16:15:54 -- accel/accel.sh@20 -- # IFS=: 00:07:25.272 16:15:54 -- accel/accel.sh@20 -- # read -r var val 00:07:25.272 16:15:54 -- accel/accel.sh@21 -- # val= 00:07:25.272 16:15:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.272 16:15:54 -- accel/accel.sh@20 -- # IFS=: 00:07:25.272 16:15:54 -- accel/accel.sh@20 -- # read -r var val 00:07:25.272 16:15:54 -- accel/accel.sh@21 -- # val= 00:07:25.272 16:15:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.272 16:15:54 -- accel/accel.sh@20 -- # IFS=: 00:07:25.272 16:15:54 -- accel/accel.sh@20 -- # read -r var val 00:07:25.272 16:15:54 -- accel/accel.sh@21 -- # val= 00:07:25.272 16:15:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.272 16:15:54 -- accel/accel.sh@20 -- # IFS=: 00:07:25.272 16:15:54 -- accel/accel.sh@20 -- # read -r var val 00:07:25.272 16:15:54 -- accel/accel.sh@21 -- # val= 00:07:25.272 16:15:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.272 16:15:54 -- accel/accel.sh@20 -- # IFS=: 00:07:25.272 16:15:54 -- accel/accel.sh@20 -- # read -r var val 00:07:25.272 16:15:54 -- accel/accel.sh@21 -- # val= 00:07:25.272 16:15:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.272 
16:15:54 -- accel/accel.sh@20 -- # IFS=: 00:07:25.272 16:15:54 -- accel/accel.sh@20 -- # read -r var val 00:07:25.272 16:15:54 -- accel/accel.sh@21 -- # val= 00:07:25.272 16:15:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.272 16:15:54 -- accel/accel.sh@20 -- # IFS=: 00:07:25.272 16:15:54 -- accel/accel.sh@20 -- # read -r var val 00:07:25.272 16:15:54 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:25.272 16:15:54 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:25.272 16:15:54 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:25.272 00:07:25.272 real 0m2.625s 00:07:25.272 user 0m9.064s 00:07:25.272 sys 0m0.279s 00:07:25.272 16:15:54 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:25.272 16:15:54 -- common/autotest_common.sh@10 -- # set +x 00:07:25.272 ************************************ 00:07:25.272 END TEST accel_decomp_full_mcore 00:07:25.272 ************************************ 00:07:25.572 16:15:54 -- accel/accel.sh@113 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:25.572 16:15:54 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:07:25.572 16:15:54 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:25.572 16:15:54 -- common/autotest_common.sh@10 -- # set +x 00:07:25.572 ************************************ 00:07:25.572 START TEST accel_decomp_mthread 00:07:25.572 ************************************ 00:07:25.572 16:15:54 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:25.572 16:15:54 -- accel/accel.sh@16 -- # local accel_opc 00:07:25.572 16:15:54 -- accel/accel.sh@17 -- # local accel_module 00:07:25.572 16:15:54 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:25.572 16:15:54 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:25.572 16:15:54 -- accel/accel.sh@12 -- # build_accel_config 00:07:25.572 16:15:54 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:25.572 16:15:54 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:25.572 16:15:54 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:25.572 16:15:54 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:25.572 16:15:54 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:25.572 16:15:54 -- accel/accel.sh@41 -- # local IFS=, 00:07:25.572 16:15:54 -- accel/accel.sh@42 -- # jq -r . 00:07:25.572 [2024-07-20 16:15:54.133643] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:07:25.572 [2024-07-20 16:15:54.133738] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2267473 ] 00:07:25.572 EAL: No free 2048 kB hugepages reported on node 1 00:07:25.572 [2024-07-20 16:15:54.203858] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:25.572 [2024-07-20 16:15:54.239868] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:26.993 16:15:55 -- accel/accel.sh@18 -- # out='Preparing input file... 
00:07:26.993 00:07:26.993 SPDK Configuration: 00:07:26.993 Core mask: 0x1 00:07:26.993 00:07:26.993 Accel Perf Configuration: 00:07:26.993 Workload Type: decompress 00:07:26.993 Transfer size: 4096 bytes 00:07:26.993 Vector count 1 00:07:26.993 Module: software 00:07:26.993 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:26.993 Queue depth: 32 00:07:26.993 Allocate depth: 32 00:07:26.993 # threads/core: 2 00:07:26.993 Run time: 1 seconds 00:07:26.993 Verify: Yes 00:07:26.993 00:07:26.993 Running for 1 seconds... 00:07:26.993 00:07:26.993 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:26.993 ------------------------------------------------------------------------------------ 00:07:26.993 0,1 47040/s 86 MiB/s 0 0 00:07:26.993 0,0 46912/s 86 MiB/s 0 0 00:07:26.993 ==================================================================================== 00:07:26.993 Total 93952/s 367 MiB/s 0 0' 00:07:26.993 16:15:55 -- accel/accel.sh@20 -- # IFS=: 00:07:26.993 16:15:55 -- accel/accel.sh@20 -- # read -r var val 00:07:26.993 16:15:55 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:26.993 16:15:55 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:26.993 16:15:55 -- accel/accel.sh@12 -- # build_accel_config 00:07:26.993 16:15:55 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:26.993 16:15:55 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:26.993 16:15:55 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:26.993 16:15:55 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:26.993 16:15:55 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:26.993 16:15:55 -- accel/accel.sh@41 -- # local IFS=, 00:07:26.993 16:15:55 -- accel/accel.sh@42 -- # jq -r . 00:07:26.993 [2024-07-20 16:15:55.425822] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
00:07:26.993 [2024-07-20 16:15:55.425915] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2267742 ] 00:07:26.993 EAL: No free 2048 kB hugepages reported on node 1 00:07:26.993 [2024-07-20 16:15:55.494549] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:26.993 [2024-07-20 16:15:55.529092] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:26.993 16:15:55 -- accel/accel.sh@21 -- # val= 00:07:26.993 16:15:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.993 16:15:55 -- accel/accel.sh@20 -- # IFS=: 00:07:26.993 16:15:55 -- accel/accel.sh@20 -- # read -r var val 00:07:26.993 16:15:55 -- accel/accel.sh@21 -- # val= 00:07:26.993 16:15:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.993 16:15:55 -- accel/accel.sh@20 -- # IFS=: 00:07:26.993 16:15:55 -- accel/accel.sh@20 -- # read -r var val 00:07:26.993 16:15:55 -- accel/accel.sh@21 -- # val= 00:07:26.993 16:15:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.993 16:15:55 -- accel/accel.sh@20 -- # IFS=: 00:07:26.993 16:15:55 -- accel/accel.sh@20 -- # read -r var val 00:07:26.993 16:15:55 -- accel/accel.sh@21 -- # val=0x1 00:07:26.993 16:15:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.993 16:15:55 -- accel/accel.sh@20 -- # IFS=: 00:07:26.993 16:15:55 -- accel/accel.sh@20 -- # read -r var val 00:07:26.993 16:15:55 -- accel/accel.sh@21 -- # val= 00:07:26.993 16:15:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.993 16:15:55 -- accel/accel.sh@20 -- # IFS=: 00:07:26.993 16:15:55 -- accel/accel.sh@20 -- # read -r var val 00:07:26.993 16:15:55 -- accel/accel.sh@21 -- # val= 00:07:26.993 16:15:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.993 16:15:55 -- accel/accel.sh@20 -- # IFS=: 00:07:26.993 16:15:55 -- accel/accel.sh@20 -- # read -r var val 00:07:26.993 16:15:55 -- accel/accel.sh@21 -- # val=decompress 00:07:26.993 16:15:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.993 16:15:55 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:26.993 16:15:55 -- accel/accel.sh@20 -- # IFS=: 00:07:26.993 16:15:55 -- accel/accel.sh@20 -- # read -r var val 00:07:26.993 16:15:55 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:26.993 16:15:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.993 16:15:55 -- accel/accel.sh@20 -- # IFS=: 00:07:26.993 16:15:55 -- accel/accel.sh@20 -- # read -r var val 00:07:26.993 16:15:55 -- accel/accel.sh@21 -- # val= 00:07:26.993 16:15:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.993 16:15:55 -- accel/accel.sh@20 -- # IFS=: 00:07:26.993 16:15:55 -- accel/accel.sh@20 -- # read -r var val 00:07:26.993 16:15:55 -- accel/accel.sh@21 -- # val=software 00:07:26.993 16:15:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.993 16:15:55 -- accel/accel.sh@23 -- # accel_module=software 00:07:26.993 16:15:55 -- accel/accel.sh@20 -- # IFS=: 00:07:26.993 16:15:55 -- accel/accel.sh@20 -- # read -r var val 00:07:26.993 16:15:55 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:26.993 16:15:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.993 16:15:55 -- accel/accel.sh@20 -- # IFS=: 00:07:26.993 16:15:55 -- accel/accel.sh@20 -- # read -r var val 00:07:26.993 16:15:55 -- accel/accel.sh@21 -- # val=32 00:07:26.993 16:15:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.993 16:15:55 -- accel/accel.sh@20 -- # IFS=: 00:07:26.993 
16:15:55 -- accel/accel.sh@20 -- # read -r var val 00:07:26.993 16:15:55 -- accel/accel.sh@21 -- # val=32 00:07:26.993 16:15:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.993 16:15:55 -- accel/accel.sh@20 -- # IFS=: 00:07:26.993 16:15:55 -- accel/accel.sh@20 -- # read -r var val 00:07:26.993 16:15:55 -- accel/accel.sh@21 -- # val=2 00:07:26.993 16:15:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.993 16:15:55 -- accel/accel.sh@20 -- # IFS=: 00:07:26.993 16:15:55 -- accel/accel.sh@20 -- # read -r var val 00:07:26.993 16:15:55 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:26.993 16:15:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.993 16:15:55 -- accel/accel.sh@20 -- # IFS=: 00:07:26.993 16:15:55 -- accel/accel.sh@20 -- # read -r var val 00:07:26.993 16:15:55 -- accel/accel.sh@21 -- # val=Yes 00:07:26.993 16:15:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.993 16:15:55 -- accel/accel.sh@20 -- # IFS=: 00:07:26.993 16:15:55 -- accel/accel.sh@20 -- # read -r var val 00:07:26.993 16:15:55 -- accel/accel.sh@21 -- # val= 00:07:26.993 16:15:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.993 16:15:55 -- accel/accel.sh@20 -- # IFS=: 00:07:26.993 16:15:55 -- accel/accel.sh@20 -- # read -r var val 00:07:26.993 16:15:55 -- accel/accel.sh@21 -- # val= 00:07:26.993 16:15:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.993 16:15:55 -- accel/accel.sh@20 -- # IFS=: 00:07:26.993 16:15:55 -- accel/accel.sh@20 -- # read -r var val 00:07:27.929 16:15:56 -- accel/accel.sh@21 -- # val= 00:07:27.929 16:15:56 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.929 16:15:56 -- accel/accel.sh@20 -- # IFS=: 00:07:27.929 16:15:56 -- accel/accel.sh@20 -- # read -r var val 00:07:27.929 16:15:56 -- accel/accel.sh@21 -- # val= 00:07:27.929 16:15:56 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.929 16:15:56 -- accel/accel.sh@20 -- # IFS=: 00:07:27.929 16:15:56 -- accel/accel.sh@20 -- # read -r var val 00:07:27.929 16:15:56 -- accel/accel.sh@21 -- # val= 00:07:27.929 16:15:56 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.929 16:15:56 -- accel/accel.sh@20 -- # IFS=: 00:07:27.929 16:15:56 -- accel/accel.sh@20 -- # read -r var val 00:07:27.929 16:15:56 -- accel/accel.sh@21 -- # val= 00:07:27.929 16:15:56 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.929 16:15:56 -- accel/accel.sh@20 -- # IFS=: 00:07:27.929 16:15:56 -- accel/accel.sh@20 -- # read -r var val 00:07:27.929 16:15:56 -- accel/accel.sh@21 -- # val= 00:07:27.929 16:15:56 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.929 16:15:56 -- accel/accel.sh@20 -- # IFS=: 00:07:27.929 16:15:56 -- accel/accel.sh@20 -- # read -r var val 00:07:27.929 16:15:56 -- accel/accel.sh@21 -- # val= 00:07:27.929 16:15:56 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.929 16:15:56 -- accel/accel.sh@20 -- # IFS=: 00:07:27.929 16:15:56 -- accel/accel.sh@20 -- # read -r var val 00:07:27.929 16:15:56 -- accel/accel.sh@21 -- # val= 00:07:27.929 16:15:56 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.929 16:15:56 -- accel/accel.sh@20 -- # IFS=: 00:07:27.929 16:15:56 -- accel/accel.sh@20 -- # read -r var val 00:07:27.929 16:15:56 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:27.929 16:15:56 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:27.929 16:15:56 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:27.929 00:07:27.929 real 0m2.590s 00:07:27.929 user 0m2.340s 00:07:27.929 sys 0m0.258s 00:07:27.929 16:15:56 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:27.929 16:15:56 -- common/autotest_common.sh@10 -- # 
set +x 00:07:27.929 ************************************ 00:07:27.929 END TEST accel_decomp_mthread 00:07:27.929 ************************************ 00:07:28.187 16:15:56 -- accel/accel.sh@114 -- # run_test accel_deomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:28.187 16:15:56 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:07:28.187 16:15:56 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:28.187 16:15:56 -- common/autotest_common.sh@10 -- # set +x 00:07:28.187 ************************************ 00:07:28.187 START TEST accel_deomp_full_mthread 00:07:28.187 ************************************ 00:07:28.187 16:15:56 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:28.187 16:15:56 -- accel/accel.sh@16 -- # local accel_opc 00:07:28.187 16:15:56 -- accel/accel.sh@17 -- # local accel_module 00:07:28.187 16:15:56 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:28.187 16:15:56 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:28.187 16:15:56 -- accel/accel.sh@12 -- # build_accel_config 00:07:28.187 16:15:56 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:28.187 16:15:56 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:28.187 16:15:56 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:28.187 16:15:56 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:28.187 16:15:56 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:28.187 16:15:56 -- accel/accel.sh@41 -- # local IFS=, 00:07:28.187 16:15:56 -- accel/accel.sh@42 -- # jq -r . 00:07:28.187 [2024-07-20 16:15:56.771019] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:07:28.187 [2024-07-20 16:15:56.771108] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2268023 ] 00:07:28.187 EAL: No free 2048 kB hugepages reported on node 1 00:07:28.187 [2024-07-20 16:15:56.840832] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:28.187 [2024-07-20 16:15:56.876839] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:29.577 16:15:58 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:29.577 00:07:29.577 SPDK Configuration: 00:07:29.577 Core mask: 0x1 00:07:29.577 00:07:29.577 Accel Perf Configuration: 00:07:29.577 Workload Type: decompress 00:07:29.577 Transfer size: 111250 bytes 00:07:29.577 Vector count 1 00:07:29.577 Module: software 00:07:29.577 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:29.577 Queue depth: 32 00:07:29.577 Allocate depth: 32 00:07:29.577 # threads/core: 2 00:07:29.577 Run time: 1 seconds 00:07:29.577 Verify: Yes 00:07:29.577 00:07:29.577 Running for 1 seconds... 
00:07:29.577 00:07:29.577 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:29.577 ------------------------------------------------------------------------------------ 00:07:29.577 0,1 3008/s 124 MiB/s 0 0 00:07:29.577 0,0 2976/s 122 MiB/s 0 0 00:07:29.577 ==================================================================================== 00:07:29.577 Total 5984/s 634 MiB/s 0 0' 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # IFS=: 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # read -r var val 00:07:29.577 16:15:58 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:29.577 16:15:58 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:29.577 16:15:58 -- accel/accel.sh@12 -- # build_accel_config 00:07:29.577 16:15:58 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:29.577 16:15:58 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:29.577 16:15:58 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:29.577 16:15:58 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:29.577 16:15:58 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:29.577 16:15:58 -- accel/accel.sh@41 -- # local IFS=, 00:07:29.577 16:15:58 -- accel/accel.sh@42 -- # jq -r . 00:07:29.577 [2024-07-20 16:15:58.083622] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:07:29.577 [2024-07-20 16:15:58.083713] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2268279 ] 00:07:29.577 EAL: No free 2048 kB hugepages reported on node 1 00:07:29.577 [2024-07-20 16:15:58.151765] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:29.577 [2024-07-20 16:15:58.186427] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:29.577 16:15:58 -- accel/accel.sh@21 -- # val= 00:07:29.577 16:15:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # IFS=: 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # read -r var val 00:07:29.577 16:15:58 -- accel/accel.sh@21 -- # val= 00:07:29.577 16:15:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # IFS=: 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # read -r var val 00:07:29.577 16:15:58 -- accel/accel.sh@21 -- # val= 00:07:29.577 16:15:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # IFS=: 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # read -r var val 00:07:29.577 16:15:58 -- accel/accel.sh@21 -- # val=0x1 00:07:29.577 16:15:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # IFS=: 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # read -r var val 00:07:29.577 16:15:58 -- accel/accel.sh@21 -- # val= 00:07:29.577 16:15:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # IFS=: 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # read -r var val 00:07:29.577 16:15:58 -- accel/accel.sh@21 -- # val= 00:07:29.577 16:15:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # IFS=: 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # read -r var val 00:07:29.577 16:15:58 -- accel/accel.sh@21 -- # val=decompress 
00:07:29.577 16:15:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.577 16:15:58 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # IFS=: 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # read -r var val 00:07:29.577 16:15:58 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:29.577 16:15:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # IFS=: 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # read -r var val 00:07:29.577 16:15:58 -- accel/accel.sh@21 -- # val= 00:07:29.577 16:15:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # IFS=: 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # read -r var val 00:07:29.577 16:15:58 -- accel/accel.sh@21 -- # val=software 00:07:29.577 16:15:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.577 16:15:58 -- accel/accel.sh@23 -- # accel_module=software 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # IFS=: 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # read -r var val 00:07:29.577 16:15:58 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:29.577 16:15:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # IFS=: 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # read -r var val 00:07:29.577 16:15:58 -- accel/accel.sh@21 -- # val=32 00:07:29.577 16:15:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # IFS=: 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # read -r var val 00:07:29.577 16:15:58 -- accel/accel.sh@21 -- # val=32 00:07:29.577 16:15:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # IFS=: 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # read -r var val 00:07:29.577 16:15:58 -- accel/accel.sh@21 -- # val=2 00:07:29.577 16:15:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # IFS=: 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # read -r var val 00:07:29.577 16:15:58 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:29.577 16:15:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # IFS=: 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # read -r var val 00:07:29.577 16:15:58 -- accel/accel.sh@21 -- # val=Yes 00:07:29.577 16:15:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # IFS=: 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # read -r var val 00:07:29.577 16:15:58 -- accel/accel.sh@21 -- # val= 00:07:29.577 16:15:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # IFS=: 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # read -r var val 00:07:29.577 16:15:58 -- accel/accel.sh@21 -- # val= 00:07:29.577 16:15:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # IFS=: 00:07:29.577 16:15:58 -- accel/accel.sh@20 -- # read -r var val 00:07:30.954 16:15:59 -- accel/accel.sh@21 -- # val= 00:07:30.954 16:15:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.954 16:15:59 -- accel/accel.sh@20 -- # IFS=: 00:07:30.954 16:15:59 -- accel/accel.sh@20 -- # read -r var val 00:07:30.954 16:15:59 -- accel/accel.sh@21 -- # val= 00:07:30.954 16:15:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.954 16:15:59 -- accel/accel.sh@20 -- # IFS=: 00:07:30.954 16:15:59 -- accel/accel.sh@20 -- # read -r var val 00:07:30.954 16:15:59 -- accel/accel.sh@21 -- # val= 00:07:30.954 16:15:59 -- 
accel/accel.sh@22 -- # case "$var" in 00:07:30.954 16:15:59 -- accel/accel.sh@20 -- # IFS=: 00:07:30.954 16:15:59 -- accel/accel.sh@20 -- # read -r var val 00:07:30.954 16:15:59 -- accel/accel.sh@21 -- # val= 00:07:30.954 16:15:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.954 16:15:59 -- accel/accel.sh@20 -- # IFS=: 00:07:30.954 16:15:59 -- accel/accel.sh@20 -- # read -r var val 00:07:30.954 16:15:59 -- accel/accel.sh@21 -- # val= 00:07:30.954 16:15:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.954 16:15:59 -- accel/accel.sh@20 -- # IFS=: 00:07:30.954 16:15:59 -- accel/accel.sh@20 -- # read -r var val 00:07:30.954 16:15:59 -- accel/accel.sh@21 -- # val= 00:07:30.954 16:15:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.954 16:15:59 -- accel/accel.sh@20 -- # IFS=: 00:07:30.954 16:15:59 -- accel/accel.sh@20 -- # read -r var val 00:07:30.954 16:15:59 -- accel/accel.sh@21 -- # val= 00:07:30.954 16:15:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.954 16:15:59 -- accel/accel.sh@20 -- # IFS=: 00:07:30.954 16:15:59 -- accel/accel.sh@20 -- # read -r var val 00:07:30.954 16:15:59 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:30.954 16:15:59 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:30.954 16:15:59 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:30.954 00:07:30.954 real 0m2.626s 00:07:30.954 user 0m2.372s 00:07:30.954 sys 0m0.262s 00:07:30.954 16:15:59 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:30.954 16:15:59 -- common/autotest_common.sh@10 -- # set +x 00:07:30.954 ************************************ 00:07:30.954 END TEST accel_deomp_full_mthread 00:07:30.954 ************************************ 00:07:30.954 16:15:59 -- accel/accel.sh@116 -- # [[ n == y ]] 00:07:30.954 16:15:59 -- accel/accel.sh@129 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:30.954 16:15:59 -- accel/accel.sh@129 -- # build_accel_config 00:07:30.954 16:15:59 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:07:30.954 16:15:59 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:30.954 16:15:59 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:30.954 16:15:59 -- common/autotest_common.sh@10 -- # set +x 00:07:30.954 16:15:59 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:30.954 16:15:59 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:30.954 16:15:59 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:30.954 16:15:59 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:30.954 16:15:59 -- accel/accel.sh@41 -- # local IFS=, 00:07:30.954 16:15:59 -- accel/accel.sh@42 -- # jq -r . 00:07:30.954 ************************************ 00:07:30.954 START TEST accel_dif_functional_tests 00:07:30.954 ************************************ 00:07:30.954 16:15:59 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:30.954 [2024-07-20 16:15:59.450054] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
00:07:30.954 [2024-07-20 16:15:59.450147] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2268470 ] 00:07:30.954 EAL: No free 2048 kB hugepages reported on node 1 00:07:30.954 [2024-07-20 16:15:59.518400] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:30.954 [2024-07-20 16:15:59.555333] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:30.954 [2024-07-20 16:15:59.555429] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:30.954 [2024-07-20 16:15:59.555431] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:30.954 00:07:30.954 00:07:30.955 CUnit - A unit testing framework for C - Version 2.1-3 00:07:30.955 http://cunit.sourceforge.net/ 00:07:30.955 00:07:30.955 00:07:30.955 Suite: accel_dif 00:07:30.955 Test: verify: DIF generated, GUARD check ...passed 00:07:30.955 Test: verify: DIF generated, APPTAG check ...passed 00:07:30.955 Test: verify: DIF generated, REFTAG check ...passed 00:07:30.955 Test: verify: DIF not generated, GUARD check ...[2024-07-20 16:15:59.617712] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:30.955 [2024-07-20 16:15:59.617760] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:30.955 passed 00:07:30.955 Test: verify: DIF not generated, APPTAG check ...[2024-07-20 16:15:59.617810] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:30.955 [2024-07-20 16:15:59.617828] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:30.955 passed 00:07:30.955 Test: verify: DIF not generated, REFTAG check ...[2024-07-20 16:15:59.617848] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:30.955 [2024-07-20 16:15:59.617866] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:30.955 passed 00:07:30.955 Test: verify: APPTAG correct, APPTAG check ...passed 00:07:30.955 Test: verify: APPTAG incorrect, APPTAG check ...[2024-07-20 16:15:59.617909] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:07:30.955 passed 00:07:30.955 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:07:30.955 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:07:30.955 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:07:30.955 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-07-20 16:15:59.618008] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:07:30.955 passed 00:07:30.955 Test: generate copy: DIF generated, GUARD check ...passed 00:07:30.955 Test: generate copy: DIF generated, APTTAG check ...passed 00:07:30.955 Test: generate copy: DIF generated, REFTAG check ...passed 00:07:30.955 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:07:30.955 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:07:30.955 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:07:30.955 Test: generate copy: iovecs-len validate ...[2024-07-20 16:15:59.618175] dif.c:1167:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:07:30.955 passed 00:07:30.955 Test: generate copy: buffer alignment validate ...passed 00:07:30.955 00:07:30.955 Run Summary: Type Total Ran Passed Failed Inactive 00:07:30.955 suites 1 1 n/a 0 0 00:07:30.955 tests 20 20 20 0 0 00:07:30.955 asserts 204 204 204 0 n/a 00:07:30.955 00:07:30.955 Elapsed time = 0.000 seconds 00:07:31.215 00:07:31.215 real 0m0.340s 00:07:31.215 user 0m0.516s 00:07:31.215 sys 0m0.159s 00:07:31.215 16:15:59 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:31.215 16:15:59 -- common/autotest_common.sh@10 -- # set +x 00:07:31.215 ************************************ 00:07:31.215 END TEST accel_dif_functional_tests 00:07:31.215 ************************************ 00:07:31.215 00:07:31.215 real 0m55.263s 00:07:31.215 user 1m2.820s 00:07:31.215 sys 0m7.159s 00:07:31.215 16:15:59 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:31.215 16:15:59 -- common/autotest_common.sh@10 -- # set +x 00:07:31.215 ************************************ 00:07:31.215 END TEST accel 00:07:31.215 ************************************ 00:07:31.215 16:15:59 -- spdk/autotest.sh@190 -- # run_test accel_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:31.215 16:15:59 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:31.215 16:15:59 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:31.215 16:15:59 -- common/autotest_common.sh@10 -- # set +x 00:07:31.215 ************************************ 00:07:31.215 START TEST accel_rpc 00:07:31.215 ************************************ 00:07:31.215 16:15:59 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:31.215 * Looking for test storage... 00:07:31.215 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:07:31.215 16:15:59 -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:31.215 16:15:59 -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=2268647 00:07:31.215 16:15:59 -- accel/accel_rpc.sh@15 -- # waitforlisten 2268647 00:07:31.215 16:15:59 -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:07:31.215 16:15:59 -- common/autotest_common.sh@819 -- # '[' -z 2268647 ']' 00:07:31.215 16:15:59 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:31.215 16:15:59 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:31.215 16:15:59 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:31.215 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:31.215 16:15:59 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:31.215 16:15:59 -- common/autotest_common.sh@10 -- # set +x 00:07:31.215 [2024-07-20 16:15:59.971584] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
00:07:31.215 [2024-07-20 16:15:59.971669] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2268647 ] 00:07:31.215 EAL: No free 2048 kB hugepages reported on node 1 00:07:31.474 [2024-07-20 16:16:00.041146] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:31.474 [2024-07-20 16:16:00.079348] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:31.474 [2024-07-20 16:16:00.079489] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:31.474 16:16:00 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:31.474 16:16:00 -- common/autotest_common.sh@852 -- # return 0 00:07:31.474 16:16:00 -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:07:31.474 16:16:00 -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:07:31.474 16:16:00 -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:07:31.474 16:16:00 -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:07:31.474 16:16:00 -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:07:31.474 16:16:00 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:31.474 16:16:00 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:31.474 16:16:00 -- common/autotest_common.sh@10 -- # set +x 00:07:31.474 ************************************ 00:07:31.474 START TEST accel_assign_opcode 00:07:31.474 ************************************ 00:07:31.474 16:16:00 -- common/autotest_common.sh@1104 -- # accel_assign_opcode_test_suite 00:07:31.474 16:16:00 -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:07:31.474 16:16:00 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:31.474 16:16:00 -- common/autotest_common.sh@10 -- # set +x 00:07:31.474 [2024-07-20 16:16:00.139962] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:07:31.474 16:16:00 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:31.474 16:16:00 -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:07:31.474 16:16:00 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:31.474 16:16:00 -- common/autotest_common.sh@10 -- # set +x 00:07:31.474 [2024-07-20 16:16:00.147966] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:07:31.474 16:16:00 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:31.474 16:16:00 -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:07:31.474 16:16:00 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:31.474 16:16:00 -- common/autotest_common.sh@10 -- # set +x 00:07:31.733 16:16:00 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:31.733 16:16:00 -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:07:31.733 16:16:00 -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:07:31.733 16:16:00 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:31.733 16:16:00 -- common/autotest_common.sh@10 -- # set +x 00:07:31.733 16:16:00 -- accel/accel_rpc.sh@42 -- # grep software 00:07:31.733 16:16:00 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:31.733 software 00:07:31.733 00:07:31.733 real 0m0.223s 00:07:31.733 user 0m0.043s 00:07:31.733 sys 0m0.017s 00:07:31.733 16:16:00 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:31.733 16:16:00 -- common/autotest_common.sh@10 -- # set +x 
00:07:31.733 ************************************ 00:07:31.733 END TEST accel_assign_opcode 00:07:31.733 ************************************ 00:07:31.733 16:16:00 -- accel/accel_rpc.sh@55 -- # killprocess 2268647 00:07:31.733 16:16:00 -- common/autotest_common.sh@926 -- # '[' -z 2268647 ']' 00:07:31.733 16:16:00 -- common/autotest_common.sh@930 -- # kill -0 2268647 00:07:31.733 16:16:00 -- common/autotest_common.sh@931 -- # uname 00:07:31.733 16:16:00 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:31.733 16:16:00 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2268647 00:07:31.733 16:16:00 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:31.733 16:16:00 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:31.733 16:16:00 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2268647' 00:07:31.733 killing process with pid 2268647 00:07:31.733 16:16:00 -- common/autotest_common.sh@945 -- # kill 2268647 00:07:31.733 16:16:00 -- common/autotest_common.sh@950 -- # wait 2268647 00:07:31.992 00:07:31.993 real 0m0.873s 00:07:31.993 user 0m0.760s 00:07:31.993 sys 0m0.435s 00:07:31.993 16:16:00 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:31.993 16:16:00 -- common/autotest_common.sh@10 -- # set +x 00:07:31.993 ************************************ 00:07:31.993 END TEST accel_rpc 00:07:31.993 ************************************ 00:07:31.993 16:16:00 -- spdk/autotest.sh@191 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:31.993 16:16:00 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:31.993 16:16:00 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:31.993 16:16:00 -- common/autotest_common.sh@10 -- # set +x 00:07:31.993 ************************************ 00:07:31.993 START TEST app_cmdline 00:07:31.993 ************************************ 00:07:31.993 16:16:00 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:32.252 * Looking for test storage... 00:07:32.252 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:32.252 16:16:00 -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:32.252 16:16:00 -- app/cmdline.sh@17 -- # spdk_tgt_pid=2268800 00:07:32.252 16:16:00 -- app/cmdline.sh@18 -- # waitforlisten 2268800 00:07:32.252 16:16:00 -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:32.252 16:16:00 -- common/autotest_common.sh@819 -- # '[' -z 2268800 ']' 00:07:32.252 16:16:00 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:32.252 16:16:00 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:32.252 16:16:00 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:32.252 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:32.252 16:16:00 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:32.252 16:16:00 -- common/autotest_common.sh@10 -- # set +x 00:07:32.252 [2024-07-20 16:16:00.885438] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
00:07:32.252 [2024-07-20 16:16:00.885538] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2268800 ] 00:07:32.252 EAL: No free 2048 kB hugepages reported on node 1 00:07:32.252 [2024-07-20 16:16:00.954350] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:32.252 [2024-07-20 16:16:00.991722] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:32.252 [2024-07-20 16:16:00.991835] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:33.185 16:16:01 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:33.185 16:16:01 -- common/autotest_common.sh@852 -- # return 0 00:07:33.185 16:16:01 -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:07:33.185 { 00:07:33.185 "version": "SPDK v24.01.1-pre git sha1 4b94202c6", 00:07:33.185 "fields": { 00:07:33.185 "major": 24, 00:07:33.185 "minor": 1, 00:07:33.185 "patch": 1, 00:07:33.185 "suffix": "-pre", 00:07:33.185 "commit": "4b94202c6" 00:07:33.185 } 00:07:33.185 } 00:07:33.185 16:16:01 -- app/cmdline.sh@22 -- # expected_methods=() 00:07:33.185 16:16:01 -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:07:33.185 16:16:01 -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:07:33.185 16:16:01 -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:07:33.185 16:16:01 -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:07:33.185 16:16:01 -- app/cmdline.sh@26 -- # jq -r '.[]' 00:07:33.185 16:16:01 -- app/cmdline.sh@26 -- # sort 00:07:33.185 16:16:01 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:33.185 16:16:01 -- common/autotest_common.sh@10 -- # set +x 00:07:33.185 16:16:01 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:33.185 16:16:01 -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:07:33.186 16:16:01 -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:07:33.186 16:16:01 -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:33.186 16:16:01 -- common/autotest_common.sh@640 -- # local es=0 00:07:33.186 16:16:01 -- common/autotest_common.sh@642 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:33.186 16:16:01 -- common/autotest_common.sh@628 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:33.186 16:16:01 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:33.186 16:16:01 -- common/autotest_common.sh@632 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:33.186 16:16:01 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:33.186 16:16:01 -- common/autotest_common.sh@634 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:33.186 16:16:01 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:33.186 16:16:01 -- common/autotest_common.sh@634 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:33.186 16:16:01 -- common/autotest_common.sh@634 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:07:33.186 16:16:01 -- 
common/autotest_common.sh@643 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:33.445 request: 00:07:33.445 { 00:07:33.445 "method": "env_dpdk_get_mem_stats", 00:07:33.445 "req_id": 1 00:07:33.445 } 00:07:33.445 Got JSON-RPC error response 00:07:33.445 response: 00:07:33.445 { 00:07:33.445 "code": -32601, 00:07:33.445 "message": "Method not found" 00:07:33.445 } 00:07:33.445 16:16:02 -- common/autotest_common.sh@643 -- # es=1 00:07:33.445 16:16:02 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:07:33.445 16:16:02 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:07:33.445 16:16:02 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:07:33.445 16:16:02 -- app/cmdline.sh@1 -- # killprocess 2268800 00:07:33.445 16:16:02 -- common/autotest_common.sh@926 -- # '[' -z 2268800 ']' 00:07:33.445 16:16:02 -- common/autotest_common.sh@930 -- # kill -0 2268800 00:07:33.445 16:16:02 -- common/autotest_common.sh@931 -- # uname 00:07:33.445 16:16:02 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:33.445 16:16:02 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2268800 00:07:33.445 16:16:02 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:33.445 16:16:02 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:33.445 16:16:02 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2268800' 00:07:33.445 killing process with pid 2268800 00:07:33.445 16:16:02 -- common/autotest_common.sh@945 -- # kill 2268800 00:07:33.445 16:16:02 -- common/autotest_common.sh@950 -- # wait 2268800 00:07:33.704 00:07:33.704 real 0m1.639s 00:07:33.704 user 0m1.892s 00:07:33.704 sys 0m0.476s 00:07:33.704 16:16:02 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:33.704 16:16:02 -- common/autotest_common.sh@10 -- # set +x 00:07:33.704 ************************************ 00:07:33.704 END TEST app_cmdline 00:07:33.704 ************************************ 00:07:33.704 16:16:02 -- spdk/autotest.sh@192 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:33.704 16:16:02 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:33.704 16:16:02 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:33.704 16:16:02 -- common/autotest_common.sh@10 -- # set +x 00:07:33.704 ************************************ 00:07:33.704 START TEST version 00:07:33.704 ************************************ 00:07:33.704 16:16:02 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:33.963 * Looking for test storage... 
00:07:33.963 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:33.963 16:16:02 -- app/version.sh@17 -- # get_header_version major 00:07:33.963 16:16:02 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:33.963 16:16:02 -- app/version.sh@14 -- # cut -f2 00:07:33.963 16:16:02 -- app/version.sh@14 -- # tr -d '"' 00:07:33.963 16:16:02 -- app/version.sh@17 -- # major=24 00:07:33.963 16:16:02 -- app/version.sh@18 -- # get_header_version minor 00:07:33.963 16:16:02 -- app/version.sh@14 -- # cut -f2 00:07:33.963 16:16:02 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:33.963 16:16:02 -- app/version.sh@14 -- # tr -d '"' 00:07:33.963 16:16:02 -- app/version.sh@18 -- # minor=1 00:07:33.963 16:16:02 -- app/version.sh@19 -- # get_header_version patch 00:07:33.963 16:16:02 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:33.963 16:16:02 -- app/version.sh@14 -- # cut -f2 00:07:33.963 16:16:02 -- app/version.sh@14 -- # tr -d '"' 00:07:33.963 16:16:02 -- app/version.sh@19 -- # patch=1 00:07:33.963 16:16:02 -- app/version.sh@20 -- # get_header_version suffix 00:07:33.963 16:16:02 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:33.963 16:16:02 -- app/version.sh@14 -- # tr -d '"' 00:07:33.963 16:16:02 -- app/version.sh@14 -- # cut -f2 00:07:33.963 16:16:02 -- app/version.sh@20 -- # suffix=-pre 00:07:33.963 16:16:02 -- app/version.sh@22 -- # version=24.1 00:07:33.963 16:16:02 -- app/version.sh@25 -- # (( patch != 0 )) 00:07:33.963 16:16:02 -- app/version.sh@25 -- # version=24.1.1 00:07:33.963 16:16:02 -- app/version.sh@28 -- # version=24.1.1rc0 00:07:33.964 16:16:02 -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:33.964 16:16:02 -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:07:33.964 16:16:02 -- app/version.sh@30 -- # py_version=24.1.1rc0 00:07:33.964 16:16:02 -- app/version.sh@31 -- # [[ 24.1.1rc0 == \2\4\.\1\.\1\r\c\0 ]] 00:07:33.964 00:07:33.964 real 0m0.171s 00:07:33.964 user 0m0.084s 00:07:33.964 sys 0m0.124s 00:07:33.964 16:16:02 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:33.964 16:16:02 -- common/autotest_common.sh@10 -- # set +x 00:07:33.964 ************************************ 00:07:33.964 END TEST version 00:07:33.964 ************************************ 00:07:33.964 16:16:02 -- spdk/autotest.sh@194 -- # '[' 0 -eq 1 ']' 00:07:33.964 16:16:02 -- spdk/autotest.sh@204 -- # uname -s 00:07:33.964 16:16:02 -- spdk/autotest.sh@204 -- # [[ Linux == Linux ]] 00:07:33.964 16:16:02 -- spdk/autotest.sh@205 -- # [[ 0 -eq 1 ]] 00:07:33.964 16:16:02 -- spdk/autotest.sh@205 -- # [[ 0 -eq 1 ]] 00:07:33.964 16:16:02 -- spdk/autotest.sh@217 -- # '[' 0 -eq 1 ']' 00:07:33.964 16:16:02 -- spdk/autotest.sh@264 -- # '[' 0 -eq 1 ']' 00:07:33.964 16:16:02 -- spdk/autotest.sh@268 -- # timing_exit lib 
00:07:33.964 16:16:02 -- common/autotest_common.sh@718 -- # xtrace_disable 00:07:33.964 16:16:02 -- common/autotest_common.sh@10 -- # set +x 00:07:33.964 16:16:02 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:07:33.964 16:16:02 -- spdk/autotest.sh@278 -- # '[' 0 -eq 1 ']' 00:07:33.964 16:16:02 -- spdk/autotest.sh@287 -- # '[' 0 -eq 1 ']' 00:07:33.964 16:16:02 -- spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:07:33.964 16:16:02 -- spdk/autotest.sh@315 -- # '[' 0 -eq 1 ']' 00:07:33.964 16:16:02 -- spdk/autotest.sh@319 -- # '[' 0 -eq 1 ']' 00:07:33.964 16:16:02 -- spdk/autotest.sh@324 -- # '[' 0 -eq 1 ']' 00:07:33.964 16:16:02 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:07:33.964 16:16:02 -- spdk/autotest.sh@338 -- # '[' 0 -eq 1 ']' 00:07:33.964 16:16:02 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:07:33.964 16:16:02 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:07:33.964 16:16:02 -- spdk/autotest.sh@350 -- # '[' 0 -eq 1 ']' 00:07:33.964 16:16:02 -- spdk/autotest.sh@355 -- # '[' 0 -eq 1 ']' 00:07:33.964 16:16:02 -- spdk/autotest.sh@359 -- # '[' 0 -eq 1 ']' 00:07:33.964 16:16:02 -- spdk/autotest.sh@366 -- # [[ 0 -eq 1 ]] 00:07:33.964 16:16:02 -- spdk/autotest.sh@370 -- # [[ 0 -eq 1 ]] 00:07:33.964 16:16:02 -- spdk/autotest.sh@374 -- # [[ 1 -eq 1 ]] 00:07:33.964 16:16:02 -- spdk/autotest.sh@375 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:33.964 16:16:02 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:33.964 16:16:02 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:33.964 16:16:02 -- common/autotest_common.sh@10 -- # set +x 00:07:33.964 ************************************ 00:07:33.964 START TEST llvm_fuzz 00:07:33.964 ************************************ 00:07:33.964 16:16:02 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:34.224 * Looking for test storage... 
00:07:34.224 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:07:34.224 16:16:02 -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:07:34.224 16:16:02 -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:07:34.224 16:16:02 -- common/autotest_common.sh@538 -- # fuzzers=() 00:07:34.224 16:16:02 -- common/autotest_common.sh@538 -- # local fuzzers 00:07:34.224 16:16:02 -- common/autotest_common.sh@540 -- # [[ -n '' ]] 00:07:34.224 16:16:02 -- common/autotest_common.sh@543 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:07:34.224 16:16:02 -- common/autotest_common.sh@544 -- # fuzzers=("${fuzzers[@]##*/}") 00:07:34.224 16:16:02 -- common/autotest_common.sh@547 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:07:34.224 16:16:02 -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:34.224 16:16:02 -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/coverage 00:07:34.224 16:16:02 -- fuzz/llvm.sh@56 -- # [[ 1 -eq 0 ]] 00:07:34.224 16:16:02 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:07:34.224 16:16:02 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:07:34.224 16:16:02 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:07:34.224 16:16:02 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:07:34.224 16:16:02 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:07:34.224 16:16:02 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:07:34.224 16:16:02 -- fuzz/llvm.sh@62 -- # run_test nvmf_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:34.224 16:16:02 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:34.224 16:16:02 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:34.224 16:16:02 -- common/autotest_common.sh@10 -- # set +x 00:07:34.224 ************************************ 00:07:34.224 START TEST nvmf_fuzz 00:07:34.224 ************************************ 00:07:34.224 16:16:02 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:34.224 * Looking for test storage... 
00:07:34.224 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf
00:07:34.224 16:16:02 -- nvmf/run.sh@52 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh
00:07:34.224 16:16:02 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh
00:07:34.224 16:16:02 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd
00:07:34.224 16:16:02 -- common/autotest_common.sh@34 -- # set -e
00:07:34.224 16:16:02 -- common/autotest_common.sh@35 -- # shopt -s nullglob
00:07:34.224 16:16:02 -- common/autotest_common.sh@36 -- # shopt -s extglob
00:07:34.224 16:16:02 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]]
00:07:34.224 16:16:02 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh
00:07:34.224 16:16:02 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR=
00:07:34.224 16:16:02 -- common/build_config.sh@2 -- # CONFIG_ASAN=n
00:07:34.224 16:16:02 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n
00:07:34.224 16:16:02 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y
00:07:34.224 16:16:02 -- common/build_config.sh@5 -- # CONFIG_USDT=n
00:07:34.224 16:16:02 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n
00:07:34.224 16:16:02 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local
00:07:34.224 16:16:02 -- common/build_config.sh@8 -- # CONFIG_RBD=n
00:07:34.224 16:16:02 -- common/build_config.sh@9 -- # CONFIG_LIBDIR=
00:07:34.224 16:16:02 -- common/build_config.sh@10 -- # CONFIG_IDXD=y
00:07:34.224 16:16:02 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y
00:07:34.224 16:16:02 -- common/build_config.sh@12 -- # CONFIG_SMA=n
00:07:34.224 16:16:02 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n
00:07:34.224 16:16:02 -- common/build_config.sh@14 -- # CONFIG_TSAN=n
00:07:34.224 16:16:02 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y
00:07:34.224 16:16:02 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR=
00:07:34.224 16:16:02 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n
00:07:34.224 16:16:02 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y
00:07:34.224 16:16:02 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:07:34.224 16:16:02 -- common/build_config.sh@20 -- # CONFIG_LTO=n
00:07:34.224 16:16:02 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y
00:07:34.224 16:16:02 -- common/build_config.sh@22 -- # CONFIG_CET=n
00:07:34.224 16:16:02 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n
00:07:34.224 16:16:02 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH=
00:07:34.224 16:16:02 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y
00:07:34.224 16:16:02 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y
00:07:34.224 16:16:02 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n
00:07:34.224 16:16:02 -- common/build_config.sh@28 -- # CONFIG_UBLK=y
00:07:34.224 16:16:02 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y
00:07:34.224 16:16:02 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH=
00:07:34.224 16:16:02 -- common/build_config.sh@31 -- # CONFIG_OCF=n
00:07:34.224 16:16:02 -- common/build_config.sh@32 -- # CONFIG_FUSE=n
00:07:34.224 16:16:02 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR=
00:07:34.224 16:16:02 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a
00:07:34.224 16:16:02 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y
00:07:34.224 16:16:02 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:07:34.224 16:16:02 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n
00:07:34.224 16:16:02 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n
00:07:34.224 16:16:02 -- common/build_config.sh@39 -- # CONFIG_VHOST=y
00:07:34.224 16:16:02 -- common/build_config.sh@40 -- # CONFIG_DAOS=n
00:07:34.224 16:16:02 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:07:34.224 16:16:02 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR=
00:07:34.224 16:16:02 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n
00:07:34.224 16:16:02 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y
00:07:34.224 16:16:02 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y
00:07:34.224 16:16:02 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y
00:07:34.225 16:16:02 -- common/build_config.sh@47 -- # CONFIG_RDMA=y
00:07:34.225 16:16:02 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio
00:07:34.225 16:16:02 -- common/build_config.sh@49 -- # CONFIG_URING_PATH=
00:07:34.225 16:16:02 -- common/build_config.sh@50 -- # CONFIG_XNVME=n
00:07:34.225 16:16:02 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y
00:07:34.225 16:16:02 -- common/build_config.sh@52 -- # CONFIG_ARCH=native
00:07:34.225 16:16:02 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n
00:07:34.225 16:16:02 -- common/build_config.sh@54 -- # CONFIG_WERROR=y
00:07:34.225 16:16:02 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n
00:07:34.225 16:16:02 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y
00:07:34.225 16:16:02 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR=
00:07:34.225 16:16:02 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n
00:07:34.225 16:16:02 -- common/build_config.sh@59 -- # CONFIG_ISAL=y
00:07:34.225 16:16:02 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y
00:07:34.225 16:16:02 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:07:34.225 16:16:02 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs
00:07:34.225 16:16:02 -- common/build_config.sh@63 -- # CONFIG_APPS=y
00:07:34.225 16:16:02 -- common/build_config.sh@64 -- # CONFIG_SHARED=n
00:07:34.225 16:16:02 -- common/build_config.sh@65 -- # CONFIG_FC_PATH=
00:07:34.225 16:16:02 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n
00:07:34.225 16:16:02 -- common/build_config.sh@67 -- # CONFIG_FC=n
00:07:34.225 16:16:02 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n
00:07:34.225 16:16:02 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y
00:07:34.225 16:16:02 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n
00:07:34.225 16:16:02 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y
00:07:34.225 16:16:02 -- common/build_config.sh@72 -- # CONFIG_TESTS=y
00:07:34.225 16:16:02 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n
00:07:34.225 16:16:02 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES=
00:07:34.225 16:16:02 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n
00:07:34.225 16:16:02 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y
00:07:34.225 16:16:02 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n
00:07:34.225 16:16:02 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX=
00:07:34.225 16:16:02 -- common/build_config.sh@79 -- # CONFIG_URING=n
00:07:34.225 16:16:02 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh
00:07:34.225 16:16:02 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh
00:07:34.225 16:16:02 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common
00:07:34.225 16:16:02 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common
00:07:34.225 16:16:02 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:07:34.225 16:16:02 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin
00:07:34.225 16:16:02 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app
00:07:34.225 16:16:02 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples
00:07:34.225 16:16:02 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz")
00:07:34.225 16:16:02 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt")
00:07:34.225 16:16:02 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt")
00:07:34.225 16:16:02 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost")
00:07:34.225 16:16:02 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd")
00:07:34.225 16:16:02 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt")
00:07:34.225 16:16:02 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]]
00:07:34.225 16:16:02 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H
00:07:34.225 #define SPDK_CONFIG_H
00:07:34.225 #define SPDK_CONFIG_APPS 1
00:07:34.225 #define SPDK_CONFIG_ARCH native
00:07:34.225 #undef SPDK_CONFIG_ASAN
00:07:34.225 #undef SPDK_CONFIG_AVAHI
00:07:34.225 #undef SPDK_CONFIG_CET
00:07:34.225 #define SPDK_CONFIG_COVERAGE 1
00:07:34.225 #define SPDK_CONFIG_CROSS_PREFIX
00:07:34.225 #undef SPDK_CONFIG_CRYPTO
00:07:34.225 #undef SPDK_CONFIG_CRYPTO_MLX5
00:07:34.225 #undef SPDK_CONFIG_CUSTOMOCF
00:07:34.225 #undef SPDK_CONFIG_DAOS
00:07:34.225 #define SPDK_CONFIG_DAOS_DIR
00:07:34.225 #define SPDK_CONFIG_DEBUG 1
00:07:34.225 #undef SPDK_CONFIG_DPDK_COMPRESSDEV
00:07:34.225 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:07:34.225 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:07:34.225 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:07:34.225 #undef SPDK_CONFIG_DPDK_PKG_CONFIG
00:07:34.225 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:07:34.225 #define SPDK_CONFIG_EXAMPLES 1
00:07:34.225 #undef SPDK_CONFIG_FC
00:07:34.225 #define SPDK_CONFIG_FC_PATH
00:07:34.225 #define SPDK_CONFIG_FIO_PLUGIN 1
00:07:34.225 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio
00:07:34.225 #undef SPDK_CONFIG_FUSE
00:07:34.225 #define SPDK_CONFIG_FUZZER 1
00:07:34.225 #define SPDK_CONFIG_FUZZER_LIB /usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a
00:07:34.225 #undef SPDK_CONFIG_GOLANG
00:07:34.225 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1
00:07:34.225 #define SPDK_CONFIG_HAVE_EXECINFO_H 1
00:07:34.225 #undef SPDK_CONFIG_HAVE_LIBARCHIVE
00:07:34.225 #undef SPDK_CONFIG_HAVE_LIBBSD
00:07:34.225 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1
00:07:34.225 #define SPDK_CONFIG_IDXD 1
00:07:34.225 #define SPDK_CONFIG_IDXD_KERNEL 1
00:07:34.225 #undef SPDK_CONFIG_IPSEC_MB
00:07:34.225 #define SPDK_CONFIG_IPSEC_MB_DIR
00:07:34.225 #define SPDK_CONFIG_ISAL 1
00:07:34.225 #define SPDK_CONFIG_ISAL_CRYPTO 1
00:07:34.225 #define SPDK_CONFIG_ISCSI_INITIATOR 1
00:07:34.225 #define SPDK_CONFIG_LIBDIR
00:07:34.225 #undef SPDK_CONFIG_LTO
00:07:34.225 #define SPDK_CONFIG_MAX_LCORES
00:07:34.225 #define SPDK_CONFIG_NVME_CUSE 1
00:07:34.225 #undef SPDK_CONFIG_OCF
00:07:34.225 #define SPDK_CONFIG_OCF_PATH
00:07:34.225 #define SPDK_CONFIG_OPENSSL_PATH
00:07:34.225 #undef SPDK_CONFIG_PGO_CAPTURE
00:07:34.225 #undef SPDK_CONFIG_PGO_USE
00:07:34.225 #define SPDK_CONFIG_PREFIX /usr/local
00:07:34.225 #undef SPDK_CONFIG_RAID5F
00:07:34.225 #undef SPDK_CONFIG_RBD
00:07:34.225 #define SPDK_CONFIG_RDMA 1
00:07:34.225 #define SPDK_CONFIG_RDMA_PROV verbs
00:07:34.225 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1
00:07:34.225 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1
00:07:34.225 #define SPDK_CONFIG_RDMA_SET_TOS 1
00:07:34.225 #undef SPDK_CONFIG_SHARED
00:07:34.225 #undef SPDK_CONFIG_SMA
00:07:34.225 #define SPDK_CONFIG_TESTS 1
00:07:34.225 #undef SPDK_CONFIG_TSAN
00:07:34.225 #define SPDK_CONFIG_UBLK 1
00:07:34.225 #define SPDK_CONFIG_UBSAN 1
00:07:34.225 #undef SPDK_CONFIG_UNIT_TESTS
00:07:34.225 #undef SPDK_CONFIG_URING
00:07:34.225 #define SPDK_CONFIG_URING_PATH
00:07:34.225 #undef SPDK_CONFIG_URING_ZNS
00:07:34.225 #undef SPDK_CONFIG_USDT
00:07:34.225 #undef SPDK_CONFIG_VBDEV_COMPRESS
00:07:34.225 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5
00:07:34.225 #define SPDK_CONFIG_VFIO_USER 1
00:07:34.225 #define SPDK_CONFIG_VFIO_USER_DIR
00:07:34.225 #define SPDK_CONFIG_VHOST 1
00:07:34.225 #define SPDK_CONFIG_VIRTIO 1
00:07:34.225 #undef SPDK_CONFIG_VTUNE
00:07:34.225 #define SPDK_CONFIG_VTUNE_DIR
00:07:34.225 #define SPDK_CONFIG_WERROR 1
00:07:34.225 #define SPDK_CONFIG_WPDK_DIR
00:07:34.225 #undef SPDK_CONFIG_XNVME
00:07:34.225 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]]
00:07:34.225 16:16:02 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS ))
00:07:34.225 16:16:02 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
00:07:34.225 16:16:02 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]]
00:07:34.225 16:16:02 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:07:34.225 16:16:02 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh
00:07:34.225 16:16:02 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
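[Editor's note] The applications.sh@23 trace above is one single [[ ... ]] test: the entire generated include/spdk/config.h is expanded into the command line and matched, as one string, against a glob pattern to detect a debug build. A hedged sketch of that guard (the surrounding default for SPDK_AUTOTEST_DEBUG_APPS is an assumption; the trace only shows the pattern match and the (( ... )) test):

config_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h
if [[ -e "$config_file" ]] &&
   [[ "$(<"$config_file")" == *"#define SPDK_CONFIG_DEBUG"* ]]; then
    # Debug build confirmed; debug-app behaviour is opt-in via this flag.
    : "${SPDK_AUTOTEST_DEBUG_APPS:=0}"   # assumed default
fi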
00:07:34.225 16:16:02 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:07:34.225 16:16:02 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:07:34.225 16:16:02 -- paths/export.sh@5 -- # export PATH
00:07:34.225 16:16:02 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:07:34.225 16:16:02 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common
00:07:34.225 16:16:02 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common
00:07:34.225 16:16:02 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm
00:07:34.225 16:16:02 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm
00:07:34.225 16:16:02 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../
00:07:34.225 16:16:03 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:07:34.225 16:16:03 -- pm/common@16 -- # TEST_TAG=N/A
00:07:34.225 16:16:03 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name
00:07:34.225 16:16:03 -- common/autotest_common.sh@52 -- # : 1
00:07:34.225 16:16:03 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY
00:07:34.225 16:16:03 -- common/autotest_common.sh@56 -- # : 0
00:07:34.225 16:16:03 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS
00:07:34.225 16:16:03 -- common/autotest_common.sh@58 -- # : 0
00:07:34.225 16:16:03 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND
00:07:34.225 16:16:03 -- common/autotest_common.sh@60 -- # : 1
00:07:34.225 16:16:03 -- common/autotest_common.sh@61 -- # export SPDK_RUN_FUNCTIONAL_TEST
00:07:34.225 16:16:03 -- common/autotest_common.sh@62 -- # : 0
00:07:34.225 16:16:03 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST
00:07:34.226 16:16:03 -- common/autotest_common.sh@64 -- # :
00:07:34.226 16:16:03 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD
00:07:34.226 16:16:03 -- common/autotest_common.sh@66 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD
00:07:34.226 16:16:03 -- common/autotest_common.sh@68 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL
00:07:34.226 16:16:03 -- common/autotest_common.sh@70 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI
00:07:34.226 16:16:03 -- common/autotest_common.sh@72 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR
00:07:34.226 16:16:03 -- common/autotest_common.sh@74 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME
00:07:34.226 16:16:03 -- common/autotest_common.sh@76 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR
00:07:34.226 16:16:03 -- common/autotest_common.sh@78 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP
00:07:34.226 16:16:03 -- common/autotest_common.sh@80 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI
00:07:34.226 16:16:03 -- common/autotest_common.sh@82 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE
00:07:34.226 16:16:03 -- common/autotest_common.sh@84 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP
00:07:34.226 16:16:03 -- common/autotest_common.sh@86 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF
00:07:34.226 16:16:03 -- common/autotest_common.sh@88 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER
00:07:34.226 16:16:03 -- common/autotest_common.sh@90 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU
00:07:34.226 16:16:03 -- common/autotest_common.sh@92 -- # : 1
00:07:34.226 16:16:03 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER
00:07:34.226 16:16:03 -- common/autotest_common.sh@94 -- # : 1
00:07:34.226 16:16:03 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT
00:07:34.226 16:16:03 -- common/autotest_common.sh@96 -- # : rdma
00:07:34.226 16:16:03 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT
00:07:34.226 16:16:03 -- common/autotest_common.sh@98 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD
00:07:34.226 16:16:03 -- common/autotest_common.sh@100 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST
00:07:34.226 16:16:03 -- common/autotest_common.sh@102 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV
00:07:34.226 16:16:03 -- common/autotest_common.sh@104 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT
00:07:34.226 16:16:03 -- common/autotest_common.sh@106 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS
00:07:34.226 16:16:03 -- common/autotest_common.sh@108 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@109 -- # export SPDK_TEST_VHOST_INIT
00:07:34.226 16:16:03 -- common/autotest_common.sh@110 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL
00:07:34.226 16:16:03 -- common/autotest_common.sh@112 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS
00:07:34.226 16:16:03 -- common/autotest_common.sh@114 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN
00:07:34.226 16:16:03 -- common/autotest_common.sh@116 -- # : 1
00:07:34.226 16:16:03 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN
00:07:34.226 16:16:03 -- common/autotest_common.sh@118 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:07:34.226 16:16:03 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK
00:07:34.226 16:16:03 -- common/autotest_common.sh@120 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT
00:07:34.226 16:16:03 -- common/autotest_common.sh@122 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO
00:07:34.226 16:16:03 -- common/autotest_common.sh@124 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL
00:07:34.226 16:16:03 -- common/autotest_common.sh@126 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF
00:07:34.226 16:16:03 -- common/autotest_common.sh@128 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD
00:07:34.226 16:16:03 -- common/autotest_common.sh@130 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL
00:07:34.226 16:16:03 -- common/autotest_common.sh@132 -- # : v22.11.4
00:07:34.226 16:16:03 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK
00:07:34.226 16:16:03 -- common/autotest_common.sh@134 -- # : true
00:07:34.226 16:16:03 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X
00:07:34.226 16:16:03 -- common/autotest_common.sh@136 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5
00:07:34.226 16:16:03 -- common/autotest_common.sh@138 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING
00:07:34.226 16:16:03 -- common/autotest_common.sh@140 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT
00:07:34.226 16:16:03 -- common/autotest_common.sh@142 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO
00:07:34.226 16:16:03 -- common/autotest_common.sh@144 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER
00:07:34.226 16:16:03 -- common/autotest_common.sh@146 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD
00:07:34.226 16:16:03 -- common/autotest_common.sh@148 -- # :
00:07:34.226 16:16:03 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS
00:07:34.226 16:16:03 -- common/autotest_common.sh@150 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA
00:07:34.226 16:16:03 -- common/autotest_common.sh@152 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS
00:07:34.226 16:16:03 -- common/autotest_common.sh@154 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME
00:07:34.226 16:16:03 -- common/autotest_common.sh@156 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA
00:07:34.226 16:16:03 -- common/autotest_common.sh@158 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA
00:07:34.226 16:16:03 -- common/autotest_common.sh@160 -- # : 0
00:07:34.226 16:16:03 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT
00:07:34.486 16:16:03 -- common/autotest_common.sh@163 -- # :
00:07:34.486 16:16:03 -- common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET
00:07:34.486 16:16:03 -- common/autotest_common.sh@165 -- # : 0
00:07:34.486 16:16:03 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS
00:07:34.486 16:16:03 -- common/autotest_common.sh@167 -- # : 0
00:07:34.486 16:16:03 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT
00:07:34.486 16:16:03 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib
00:07:34.486 16:16:03 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib
00:07:34.486 16:16:03 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:07:34.486 16:16:03 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:07:34.486 16:16:03 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib
00:07:34.486 16:16:03 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib
00:07:34.486 16:16:03 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib
00:07:34.486 16:16:03 -- common/autotest_common.sh@174 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib
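[Editor's note] The long run of "-- # : 0" / "-- # export SPDK_TEST_*" pairs above is the standard bash default-then-export idiom: ": ${VAR:=default}" assigns the default only when the variable is unset, and xtrace prints it with the value already substituted (which is why the log shows bare ": 0", ": 1", ": rdma", ": v22.11.4"). A minimal sketch with two of the flags seen in this run:

: "${SPDK_TEST_FUZZER:=0}"        # xtrace would print this as ": 1" in this run
: "${SPDK_TEST_FUZZER_SHORT:=0}"  # likewise ": 1" here
export SPDK_TEST_FUZZER SPDK_TEST_FUZZER_SHORT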
00:07:34.486 16:16:03 -- common/autotest_common.sh@177 -- # export PCI_BLOCK_SYNC_ON_RESET=yes
00:07:34.486 16:16:03 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes
00:07:34.486 16:16:03 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python
00:07:34.486 16:16:03 -- common/autotest_common.sh@181 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python
00:07:34.486 16:16:03 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1
00:07:34.486 16:16:03 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1
00:07:34.486 16:16:03 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0
00:07:34.486 16:16:03 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0
00:07:34.486 16:16:03 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134
00:07:34.486 16:16:03 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134
00:07:34.486 16:16:03 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file
00:07:34.486 16:16:03 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file
00:07:34.486 16:16:03 -- common/autotest_common.sh@196 -- # cat
00:07:34.486 16:16:03 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so
00:07:34.486 16:16:03 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file
00:07:34.486 16:16:03 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file
00:07:34.486 16:16:03 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock
00:07:34.486 16:16:03 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock
00:07:34.486 16:16:03 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']'
00:07:34.486 16:16:03 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR
00:07:34.486 16:16:03 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin
00:07:34.486 16:16:03 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin
00:07:34.486 16:16:03 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples
00:07:34.486 16:16:03 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples
00:07:34.486 16:16:03 -- common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:07:34.486 16:16:03 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:07:34.486 16:16:03 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:07:34.486 16:16:03 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:07:34.486 16:16:03 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer
00:07:34.486 16:16:03 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer
00:07:34.486 16:16:03 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:07:34.486 16:16:03 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes
00:07:34.486 16:16:03 -- common/autotest_common.sh@248 -- # '[' 0 -eq 0 ']'
00:07:34.486 16:16:03 -- common/autotest_common.sh@249 -- # export valgrind=
00:07:34.486 16:16:03 -- common/autotest_common.sh@249 -- # valgrind=
00:07:34.486 16:16:03 -- common/autotest_common.sh@255 -- # uname -s
00:07:34.486 16:16:03 -- common/autotest_common.sh@255 -- # '[' Linux = Linux ']'
00:07:34.486 16:16:03 -- common/autotest_common.sh@256 -- # HUGEMEM=4096
00:07:34.486 16:16:03 -- common/autotest_common.sh@257 -- # export CLEAR_HUGE=yes
00:07:34.486 16:16:03 -- common/autotest_common.sh@257 -- # CLEAR_HUGE=yes
00:07:34.486 16:16:03 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]]
00:07:34.486 16:16:03 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]]
00:07:34.486 16:16:03 -- common/autotest_common.sh@265 -- # MAKE=make
00:07:34.486 16:16:03 -- common/autotest_common.sh@266 -- # MAKEFLAGS=-j112
00:07:34.486 16:16:03 -- common/autotest_common.sh@282 -- # export HUGEMEM=4096
00:07:34.486 16:16:03 -- common/autotest_common.sh@282 -- # HUGEMEM=4096
00:07:34.486 16:16:03 -- common/autotest_common.sh@284 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']'
00:07:34.486 16:16:03 -- common/autotest_common.sh@289 -- # NO_HUGE=()
00:07:34.486 16:16:03 -- common/autotest_common.sh@290 -- # TEST_MODE=
00:07:34.486 16:16:03 -- common/autotest_common.sh@309 -- # [[ -z 2269404 ]]
00:07:34.486 16:16:03 -- common/autotest_common.sh@309 -- # kill -0 2269404
00:07:34.486 16:16:03 -- common/autotest_common.sh@1665 -- # set_test_storage 2147483648
00:07:34.486 16:16:03 -- common/autotest_common.sh@319 -- # [[ -v testdir ]]
00:07:34.486 16:16:03 -- common/autotest_common.sh@321 -- # local requested_size=2147483648
00:07:34.486 16:16:03 -- common/autotest_common.sh@322 -- # local mount target_dir
00:07:34.486 16:16:03 -- common/autotest_common.sh@324 -- # local -A mounts fss sizes avails uses
00:07:34.486 16:16:03 -- common/autotest_common.sh@325 -- # local source fs size avail mount use
00:07:34.486 16:16:03 -- common/autotest_common.sh@327 -- # local storage_fallback storage_candidates
00:07:34.486 16:16:03 -- common/autotest_common.sh@329 -- # mktemp -udt spdk.XXXXXX
00:07:34.486 16:16:03 -- common/autotest_common.sh@329 -- # storage_fallback=/tmp/spdk.FMXE7V
00:07:34.486 16:16:03 -- common/autotest_common.sh@334 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback")
00:07:34.486 16:16:03 -- common/autotest_common.sh@336 -- # [[ -n '' ]]
00:07:34.486 16:16:03 -- common/autotest_common.sh@341 -- # [[ -n '' ]]
00:07:34.486 16:16:03 -- common/autotest_common.sh@346 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.FMXE7V/tests/nvmf /tmp/spdk.FMXE7V
00:07:34.486 16:16:03 -- common/autotest_common.sh@349 -- # requested_size=2214592512
00:07:34.486 16:16:03 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount
00:07:34.486 16:16:03 -- common/autotest_common.sh@318 -- # df -T
00:07:34.486 16:16:03 -- common/autotest_common.sh@318 -- # grep -v Filesystem
00:07:34.486 16:16:03 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_devtmpfs
00:07:34.486 16:16:03 -- common/autotest_common.sh@352 -- # fss["$mount"]=devtmpfs
00:07:34.486 16:16:03 -- common/autotest_common.sh@353 -- # avails["$mount"]=67108864
00:07:34.486 16:16:03 -- common/autotest_common.sh@353 -- # sizes["$mount"]=67108864
00:07:34.486 16:16:03 -- common/autotest_common.sh@354 -- # uses["$mount"]=0
00:07:34.486 16:16:03 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount
00:07:34.486 16:16:03 -- common/autotest_common.sh@352 -- # mounts["$mount"]=/dev/pmem0
00:07:34.487 16:16:03 -- common/autotest_common.sh@352 -- # fss["$mount"]=ext2
00:07:34.487 16:16:03 -- common/autotest_common.sh@353 -- # avails["$mount"]=954408960
00:07:34.487 16:16:03 -- common/autotest_common.sh@353 -- # sizes["$mount"]=5284429824
00:07:34.487 16:16:03 -- common/autotest_common.sh@354 -- # uses["$mount"]=4330020864
00:07:34.487 16:16:03 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount
00:07:34.487 16:16:03 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_root
00:07:34.487 16:16:03 -- common/autotest_common.sh@352 -- # fss["$mount"]=overlay
00:07:34.487 16:16:03 -- common/autotest_common.sh@353 -- # avails["$mount"]=52145770496
00:07:34.487 16:16:03 -- common/autotest_common.sh@353 -- # sizes["$mount"]=61742317568
00:07:34.487 16:16:03 -- common/autotest_common.sh@354 -- # uses["$mount"]=9596547072
00:07:34.487 16:16:03 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount
00:07:34.487 16:16:03 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs
00:07:34.487 16:16:03 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs
00:07:34.487 16:16:03 -- common/autotest_common.sh@353 -- # avails["$mount"]=30868566016
00:07:34.487 16:16:03 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30871158784
00:07:34.487 16:16:03 -- common/autotest_common.sh@354 -- # uses["$mount"]=2592768
00:07:34.487 16:16:03 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount
00:07:34.487 16:16:03 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs
00:07:34.487 16:16:03 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs
00:07:34.487 16:16:03 -- common/autotest_common.sh@353 -- # avails["$mount"]=12342493184
00:07:34.487 16:16:03 -- common/autotest_common.sh@353 -- # sizes["$mount"]=12348465152
00:07:34.487 16:16:03 -- common/autotest_common.sh@354 -- # uses["$mount"]=5971968
00:07:34.487 16:16:03 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount
00:07:34.487 16:16:03 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs
00:07:34.487 16:16:03 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs
00:07:34.487 16:16:03 -- common/autotest_common.sh@353 -- # avails["$mount"]=30868783104
00:07:34.487 16:16:03 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30871158784
00:07:34.487 16:16:03 -- common/autotest_common.sh@354 -- # uses["$mount"]=2375680
00:07:34.487 16:16:03 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount
00:07:34.487 16:16:03 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs
00:07:34.487 16:16:03 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs
00:07:34.487 16:16:03 -- common/autotest_common.sh@353 -- # avails["$mount"]=6174224384
00:07:34.487 16:16:03 -- common/autotest_common.sh@353 -- # sizes["$mount"]=6174228480
00:07:34.487 16:16:03 -- common/autotest_common.sh@354 -- # uses["$mount"]=4096
00:07:34.487 16:16:03 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount
00:07:34.487 16:16:03 -- common/autotest_common.sh@357 -- # printf '* Looking for test storage...\n'
00:07:34.487 * Looking for test storage...
00:07:34.487 16:16:03 -- common/autotest_common.sh@359 -- # local target_space new_size
00:07:34.487 16:16:03 -- common/autotest_common.sh@360 -- # for target_dir in "${storage_candidates[@]}"
00:07:34.487 16:16:03 -- common/autotest_common.sh@363 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf
00:07:34.487 16:16:03 -- common/autotest_common.sh@363 -- # awk '$1 !~ /Filesystem/{print $6}'
00:07:34.487 16:16:03 -- common/autotest_common.sh@363 -- # mount=/
00:07:34.487 16:16:03 -- common/autotest_common.sh@365 -- # target_space=52145770496
00:07:34.487 16:16:03 -- common/autotest_common.sh@366 -- # (( target_space == 0 || target_space < requested_size ))
00:07:34.487 16:16:03 -- common/autotest_common.sh@369 -- # (( target_space >= requested_size ))
00:07:34.487 16:16:03 -- common/autotest_common.sh@371 -- # [[ overlay == tmpfs ]]
00:07:34.487 16:16:03 -- common/autotest_common.sh@371 -- # [[ overlay == ramfs ]]
00:07:34.487 16:16:03 -- common/autotest_common.sh@371 -- # [[ / == / ]]
00:07:34.487 16:16:03 -- common/autotest_common.sh@372 -- # new_size=11811139584
00:07:34.487 16:16:03 -- common/autotest_common.sh@373 -- # (( new_size * 100 / sizes[/] > 95 ))
00:07:34.487 16:16:03 -- common/autotest_common.sh@378 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf
00:07:34.487 16:16:03 -- common/autotest_common.sh@378 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf
00:07:34.487 16:16:03 -- common/autotest_common.sh@379 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf
00:07:34.487 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf
00:07:34.487 16:16:03 -- common/autotest_common.sh@380 -- # return 0
00:07:34.487 16:16:03 -- common/autotest_common.sh@1667 -- # set -o errtrace
00:07:34.487 16:16:03 -- common/autotest_common.sh@1668 -- # shopt -s extdebug
00:07:34.487 16:16:03 -- common/autotest_common.sh@1669 -- # trap 'trap - ERR; print_backtrace >&2' ERR
00:07:34.487 16:16:03 -- common/autotest_common.sh@1671 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ '
00:07:34.487 16:16:03 -- common/autotest_common.sh@1672 -- # true
00:07:34.487 16:16:03 -- common/autotest_common.sh@1674 -- # xtrace_fd
00:07:34.487 16:16:03 -- common/autotest_common.sh@25 -- # [[ -n 14 ]]
00:07:34.487 16:16:03 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]]
00:07:34.487 16:16:03 -- common/autotest_common.sh@27 -- # exec
00:07:34.487 16:16:03 -- common/autotest_common.sh@29 -- # exec
00:07:34.487 16:16:03 -- common/autotest_common.sh@31 -- # xtrace_restore
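[Editor's note] The @318-380 trace above is set_test_storage: it parses `df -T` into associative arrays keyed by mount point, then walks the candidate directories and exports SPDK_TEST_STORAGE for the first one whose filesystem can hold the requested size. A hedged sketch of that logic, reconstructed from the traced variable names (the real function also grows the filesystem check with the 95%-of-size guard seen at @373):

declare -A mounts fss sizes avails uses
while read -r source fs size use avail _ mount; do
    mounts["$mount"]=$source
    fss["$mount"]=$fs
    sizes["$mount"]=$size
    uses["$mount"]=$use
    avails["$mount"]=$avail
done < <(df -T | grep -v Filesystem)

requested_size=2214592512   # 2 GiB plus margin, as in the trace
testdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf
storage_fallback=$(mktemp -udt spdk.XXXXXX)
storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback")

for target_dir in "${storage_candidates[@]}"; do
    mount=$(df "$target_dir" | awk '$1 !~ /Filesystem/{print $6}')
    target_space=${avails["$mount"]}
    if ((target_space >= requested_size)); then
        export SPDK_TEST_STORAGE=$target_dir
        break
    fi
done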
00:07:34.487 16:16:03 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]'
00:07:34.487 16:16:03 -- common/autotest_common.sh@17 -- # (( 0 == 0 ))
00:07:34.487 16:16:03 -- common/autotest_common.sh@18 -- # set -x
00:07:34.487 16:16:03 -- nvmf/run.sh@53 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh
00:07:34.487 16:16:03 -- ../common.sh@8 -- # pids=()
00:07:34.487 16:16:03 -- nvmf/run.sh@55 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c
00:07:34.487 16:16:03 -- nvmf/run.sh@56 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c
00:07:34.487 16:16:03 -- nvmf/run.sh@56 -- # fuzz_num=25
00:07:34.487 16:16:03 -- nvmf/run.sh@57 -- # (( fuzz_num != 0 ))
00:07:34.487 16:16:03 -- nvmf/run.sh@59 -- # trap 'cleanup /tmp/llvm_fuzz*; exit 1' SIGINT SIGTERM EXIT
00:07:34.487 16:16:03 -- nvmf/run.sh@61 -- # mem_size=512
00:07:34.487 16:16:03 -- nvmf/run.sh@62 -- # [[ 1 -eq 1 ]]
00:07:34.487 16:16:03 -- nvmf/run.sh@63 -- # start_llvm_fuzz_short 25 1
00:07:34.487 16:16:03 -- ../common.sh@69 -- # local fuzz_num=25
00:07:34.487 16:16:03 -- ../common.sh@70 -- # local time=1
00:07:34.487 16:16:03 -- ../common.sh@72 -- # (( i = 0 ))
00:07:34.487 16:16:03 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:34.487 16:16:03 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1
00:07:34.487 16:16:03 -- nvmf/run.sh@23 -- # local fuzzer_type=0
00:07:34.487 16:16:03 -- nvmf/run.sh@24 -- # local timen=1
00:07:34.487 16:16:03 -- nvmf/run.sh@25 -- # local core=0x1
00:07:34.487 16:16:03 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0
00:07:34.487 16:16:03 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf
00:07:34.487 16:16:03 -- nvmf/run.sh@29 -- # printf %02d 0
00:07:34.487 16:16:03 -- nvmf/run.sh@29 -- # port=4400
00:07:34.487 16:16:03 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0
00:07:34.487 16:16:03 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400'
00:07:34.487 16:16:03 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:34.487 16:16:03 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 -r /var/tmp/spdk0.sock
00:07:34.487 [2024-07-20 16:16:03.145904] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization...
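[Editor's note] The run.sh@23-36 trace above shows how each fuzzer instance is parameterized: fuzzer index 0 becomes TCP port 4400 via printf, and the index, core mask, memory size, and run time flow into the llvm_nvme_fuzz invocation. A hedged reconstruction of that derivation (flag meanings are read off the traced variable names timen/core/mem_size/corpus_dir, not a documented CLI spec):

fuzzer_type=0
port="44$(printf %02d "$fuzzer_type")"   # fuzzer 0 -> 4400, fuzzer 1 -> 4401, ...
trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_$fuzzer_type
fuzz_bin=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz
# -m: core mask, -s: hugepage memory in MB, -t: seconds to run,
# -D: corpus directory, -Z: fuzzer index (informed reading of the trace)
"$fuzz_bin" -m 0x1 -s 512 -F "$trid" -c /tmp/fuzz_json_$fuzzer_type.conf \
    -t 1 -D "$corpus_dir" -Z "$fuzzer_type" -r /var/tmp/spdk$fuzzer_type.sock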
00:07:34.487 [2024-07-20 16:16:03.145963] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2269449 ]
00:07:34.487 EAL: No free 2048 kB hugepages reported on node 1
00:07:34.746 [2024-07-20 16:16:03.319936] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:34.746 [2024-07-20 16:16:03.339469] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:34.746 [2024-07-20 16:16:03.339607] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:34.746 [2024-07-20 16:16:03.391148] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:34.746 [2024-07-20 16:16:03.407490] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 ***
00:07:34.746 INFO: Running with entropic power schedule (0xFF, 100).
00:07:34.746 INFO: Seed: 320794090
00:07:34.746 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37),
00:07:34.746 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8),
00:07:34.746 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0
00:07:34.746 INFO: A corpus is not provided, starting from an empty corpus
00:07:34.746 #2 INITED exec/s: 0 rss: 60Mb
00:07:34.746 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:34.746 This may also happen if the target rejected all inputs we tried so far
00:07:34.746 [2024-07-20 16:16:03.462600] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2c) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff
00:07:34.746 [2024-07-20 16:16:03.462630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:35.004 NEW_FUNC[1/669]: 0x491840 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47
00:07:35.004 NEW_FUNC[2/669]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:07:35.004 #5 NEW cov: 11457 ft: 11457 corp: 2/128b lim: 320 exec/s: 0 rss: 68Mb L: 127/127 MS: 3 InsertByte-CMP-InsertRepeatedBytes- DE: ">\000\000\000\000\000\000\000"-
00:07:35.004 [2024-07-20 16:16:03.763445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (23) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.004 [2024-07-20 16:16:03.763484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:35.262 NEW_FUNC[1/2]: 0x12f39f0 in nvmf_tcp_req_set_cpl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:2014
00:07:35.262 NEW_FUNC[2/2]: 0x16ed4c0 in nvme_get_sgl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:159
00:07:35.262 #9 NEW cov: 11622 ft: 11972 corp: 3/216b lim: 320 exec/s: 0 rss: 68Mb L: 88/127 MS: 4 ShuffleBytes-ChangeByte-ChangeBit-InsertRepeatedBytes-
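[Editor's note] A hedged reading of the libFuzzer status lines above (per upstream libFuzzer conventions, not anything SPDK-specific), using #5 as the example:

# #5 NEW        event number 5; a new interesting input was added to the corpus
# cov: 11457    coverage points reached so far
# ft: 11457     "features" (finer-grained coverage/value signals)
# corp: 2/128b  corpus now holds 2 inputs totalling 128 bytes
# lim: 320      current input-length limit
# exec/s, rss   throughput and memory footprint
# L: 127/127    size of this input / largest seen
# MS: 3 InsertByte-CMP-InsertRepeatedBytes-   the mutation sequence that produced it
# DE: ">\000..." dictionary entry used (PersAutoDict entries are replayed later)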
00:07:35.262 [2024-07-20 16:16:03.803524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (23) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.262 [2024-07-20 16:16:03.803553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:35.262 #10 NEW cov: 11628 ft: 12312 corp: 4/304b lim: 320 exec/s: 0 rss: 68Mb L: 88/127 MS: 1 CopyPart-
00:07:35.262 [2024-07-20 16:16:03.843712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2c) qid:0 cid:4 nsid:0 cdw10:53535353 cdw11:53535353
00:07:35.262 [2024-07-20 16:16:03.843739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:35.262 [2024-07-20 16:16:03.843803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:35.262 [2024-07-20 16:16:03.843817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:35.262 #16 NEW cov: 11713 ft: 12708 corp: 5/466b lim: 320 exec/s: 0 rss: 68Mb L: 162/162 MS: 1 InsertRepeatedBytes-
00:07:35.262 [2024-07-20 16:16:03.883732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3e) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.262 [2024-07-20 16:16:03.883758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:35.262 #19 NEW cov: 11713 ft: 12865 corp: 6/545b lim: 320 exec/s: 0 rss: 68Mb L: 79/162 MS: 3 PersAutoDict-CrossOver-CopyPart- DE: ">\000\000\000\000\000\000\000"-
00:07:35.262 [2024-07-20 16:16:03.913789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (23) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.262 [2024-07-20 16:16:03.913816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:35.262 #20 NEW cov: 11713 ft: 12991 corp: 7/641b lim: 320 exec/s: 0 rss: 68Mb L: 96/162 MS: 1 PersAutoDict- DE: ">\000\000\000\000\000\000\000"-
00:07:35.262 [2024-07-20 16:16:03.954076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2c) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff
00:07:35.262 [2024-07-20 16:16:03.954102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:35.262 [2024-07-20 16:16:03.954161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:25ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:35.262 [2024-07-20 16:16:03.954175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:35.263 #21 NEW cov: 11713 ft: 13092 corp: 8/769b lim: 320 exec/s: 0 rss: 69Mb L: 128/162 MS: 1 InsertByte-
00:07:35.263 [2024-07-20 16:16:03.994057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2c) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff
00:07:35.263 [2024-07-20 16:16:03.994083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:35.263 #22 NEW cov: 11713 ft: 13123 corp: 9/896b lim: 320 exec/s: 0 rss: 69Mb L: 127/162 MS: 1 ChangeBinInt-
00:07:35.263 [2024-07-20 16:16:04.034195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2c) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff
00:07:35.263 [2024-07-20 16:16:04.034221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:35.263 #23 NEW cov: 11713 ft: 13173 corp: 10/1002b lim: 320 exec/s: 0 rss: 69Mb L: 106/162 MS: 1 EraseBytes-
00:07:35.521 [2024-07-20 16:16:04.074278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2c) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff
00:07:35.521 [2024-07-20 16:16:04.074304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:35.521 #24 NEW cov: 11713 ft: 13245 corp: 11/1129b lim: 320 exec/s: 0 rss: 69Mb L: 127/162 MS: 1 ChangeByte-
00:07:35.521 [2024-07-20 16:16:04.114363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (23) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00003e00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.521 [2024-07-20 16:16:04.114393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:35.521 #25 NEW cov: 11713 ft: 13335 corp: 12/1233b lim: 320 exec/s: 0 rss: 69Mb L: 104/162 MS: 1 PersAutoDict- DE: ">\000\000\000\000\000\000\000"-
00:07:35.521 [2024-07-20 16:16:04.154552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3e) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.521 [2024-07-20 16:16:04.154580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:35.521 #26 NEW cov: 11713 ft: 13363 corp: 13/1312b lim: 320 exec/s: 0 rss: 69Mb L: 79/162 MS: 1 PersAutoDict- DE: ">\000\000\000\000\000\000\000"-
00:07:35.521 [2024-07-20 16:16:04.194690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3e) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.521 [2024-07-20 16:16:04.194716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:35.521 #27 NEW cov: 11713 ft: 13421 corp: 14/1391b lim: 320 exec/s: 0 rss: 69Mb L: 79/162 MS: 1 ShuffleBytes-
00:07:35.521 [2024-07-20 16:16:04.234762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (23) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.521 [2024-07-20 16:16:04.234789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:35.521 #28 NEW cov: 11713 ft: 13435 corp: 15/1479b lim: 320 exec/s: 0 rss: 69Mb L: 88/162 MS: 1 ShuffleBytes-
00:07:35.521 [2024-07-20 16:16:04.274963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (23) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.521 [2024-07-20 16:16:04.274988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:35.521 [2024-07-20 16:16:04.275043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000
00:07:35.521 [2024-07-20 16:16:04.275056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:35.521 #29 NEW cov: 11714 ft: 13477 corp: 16/1623b lim: 320 exec/s: 0 rss: 69Mb L: 144/162 MS: 1 CopyPart-
00:07:35.521 [2024-07-20 16:16:04.315167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2c) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff
00:07:35.521 [2024-07-20 16:16:04.315192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:35.521 [2024-07-20 16:16:04.315256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:35.521 [2024-07-20 16:16:04.315271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:35.521 [2024-07-20 16:16:04.315348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:35.521 [2024-07-20 16:16:04.315362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:35.786 #30 NEW cov: 11714 ft: 13719 corp: 17/1857b lim: 320 exec/s: 0 rss: 69Mb L: 234/234 MS: 1 CrossOver-
00:07:35.786 [2024-07-20 16:16:04.355107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3e) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.786 [2024-07-20 16:16:04.355133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:35.786 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:07:35.786 #31 NEW cov: 11737 ft: 13756 corp: 18/1936b lim: 320 exec/s: 0 rss: 69Mb L: 79/234 MS: 1 CrossOver-
00:07:35.786 [2024-07-20 16:16:04.395197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2c) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:0000003e
00:07:35.786 [2024-07-20 16:16:04.395223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:35.786 #32 NEW cov: 11737 ft: 13772 corp: 19/2063b lim: 320 exec/s: 0 rss: 69Mb L: 127/234 MS: 1 PersAutoDict- DE: ">\000\000\000\000\000\000\000"-
00:07:35.786 [2024-07-20 16:16:04.435437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (23) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.786 [2024-07-20 16:16:04.435467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:35.786 [2024-07-20 16:16:04.435541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (30) qid:0 cid:5 nsid:30303030 cdw10:30303030 cdw11:30303030
00:07:35.786 [2024-07-20 16:16:04.435555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:35.786 #33 NEW cov: 11737 ft: 13807 corp: 20/2224b lim: 320 exec/s: 33 rss: 69Mb L: 161/234 MS: 1 InsertRepeatedBytes-
00:07:35.786 [2024-07-20 16:16:04.475461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3e) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.786 [2024-07-20 16:16:04.475487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:35.786 #34 NEW cov: 11737 ft: 13827 corp: 21/2303b lim: 320 exec/s: 34 rss: 70Mb L: 79/234 MS: 1 ChangeBinInt-
00:07:35.786 [2024-07-20 16:16:04.515646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (23) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.786 [2024-07-20 16:16:04.515673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:35.786 [2024-07-20 16:16:04.515732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (30) qid:0 cid:5 nsid:30303030 cdw10:30303030 cdw11:30303030
00:07:35.786 [2024-07-20 16:16:04.515746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:35.786 #35 NEW cov: 11737 ft: 13833 corp: 22/2464b lim: 320 exec/s: 35 rss: 70Mb L: 161/234 MS: 1 ChangeASCIIInt-
00:07:35.786 [2024-07-20 16:16:04.555673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3e) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:3e000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.786 [2024-07-20 16:16:04.555698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:35.786 #39 NEW cov: 11737 ft: 13849 corp: 23/2564b lim: 320 exec/s: 39 rss: 70Mb L: 100/234 MS: 4 EraseBytes-CopyPart-ShuffleBytes-CrossOver-
00:07:36.045 [2024-07-20 16:16:04.595801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3e) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:36.045 [2024-07-20 16:16:04.595827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:36.045 #40 NEW cov: 11737 ft: 13947 corp: 24/2643b lim: 320 exec/s: 40 rss: 70Mb L: 79/234 MS: 1 ShuffleBytes-
00:07:36.045 [2024-07-20 16:16:04.625864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3e) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:36.045 [2024-07-20 16:16:04.625889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:36.045 #41 NEW cov: 11737 ft: 13955 corp: 25/2722b lim: 320 exec/s: 41 rss: 70Mb L: 79/234 MS: 1 ChangeByte-
00:07:36.045 [2024-07-20 16:16:04.666200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2c) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff
00:07:36.045 [2024-07-20 16:16:04.666228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:36.045 [2024-07-20 16:16:04.666293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:36.045 [2024-07-20 16:16:04.666307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:36.045 [2024-07-20 16:16:04.666369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:36.045 [2024-07-20 16:16:04.666383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:36.046 #42 NEW cov: 11737 ft: 14024 corp: 26/2957b lim: 320 exec/s: 42 rss: 70Mb L: 235/235 MS: 1 InsertByte-
00:07:36.046 [2024-07-20 16:16:04.706176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2c) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff
00:07:36.046 [2024-07-20 16:16:04.706202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:36.046 [2024-07-20 16:16:04.706263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:36.046 [2024-07-20 16:16:04.706277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:36.046 #43 NEW cov: 11737 ft: 14100 corp: 27/3114b lim: 320 exec/s: 43 rss: 70Mb L: 157/235 MS: 1 InsertRepeatedBytes-
00:07:36.046 [2024-07-20 16:16:04.746352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2c) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff
00:07:36.046 [2024-07-20 16:16:04.746377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:36.046 [2024-07-20 16:16:04.746446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:36.046 [2024-07-20 16:16:04.746460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:36.046 [2024-07-20 16:16:04.746523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:36.046 [2024-07-20 16:16:04.746537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:36.046 #44 NEW cov: 11737 ft: 14103 corp: 28/3349b lim: 320 exec/s: 44 rss: 70Mb L: 235/235 MS: 1 ShuffleBytes-
00:07:36.046 [2024-07-20 16:16:04.786494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2c) qid:0 cid:4 nsid:0 cdw10:003effff cdw11:00000000
00:07:36.046 [2024-07-20 16:16:04.786519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:36.046 [2024-07-20 16:16:04.786579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:36.046 [2024-07-20 16:16:04.786593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:36.046 [2024-07-20 16:16:04.786652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffff83ffff
00:07:36.046 [2024-07-20 16:16:04.786665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:36.046 #45 NEW cov: 11737 ft: 14143 corp: 29/3584b lim: 320 exec/s: 45 rss: 70Mb L: 235/235 MS: 1
CopyPart- 00:07:36.046 [2024-07-20 16:16:04.826495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2c) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff 00:07:36.046 [2024-07-20 16:16:04.826522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.046 #46 NEW cov: 11737 ft: 14222 corp: 30/3711b lim: 320 exec/s: 46 rss: 70Mb L: 127/235 MS: 1 PersAutoDict- DE: ">\000\000\000\000\000\000\000"- 00:07:36.305 [2024-07-20 16:16:04.866599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2c) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:0000003e 00:07:36.305 [2024-07-20 16:16:04.866626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.305 #47 NEW cov: 11737 ft: 14232 corp: 31/3835b lim: 320 exec/s: 47 rss: 70Mb L: 124/235 MS: 1 EraseBytes- 00:07:36.305 [2024-07-20 16:16:04.906856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2c) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff 00:07:36.305 [2024-07-20 16:16:04.906883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.305 [2024-07-20 16:16:04.906941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:36.305 [2024-07-20 16:16:04.906955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.305 [2024-07-20 16:16:04.907017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:fffdffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:36.305 [2024-07-20 16:16:04.907031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.305 #48 NEW cov: 11737 ft: 14239 corp: 32/4069b lim: 320 exec/s: 48 rss: 70Mb L: 234/235 MS: 1 ChangeBinInt- 00:07:36.305 [2024-07-20 16:16:04.946936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2c) qid:0 cid:4 nsid:0 cdw10:53535353 cdw11:53535353 00:07:36.305 [2024-07-20 16:16:04.946961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.306 [2024-07-20 16:16:04.947024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:36.306 [2024-07-20 16:16:04.947039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.306 #49 NEW cov: 11737 ft: 14260 corp: 33/4231b lim: 320 exec/s: 49 rss: 70Mb L: 162/235 MS: 1 ChangeByte- 00:07:36.306 [2024-07-20 16:16:04.987130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2c) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff 00:07:36.306 [2024-07-20 16:16:04.987155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.306 [2024-07-20 16:16:04.987217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL 
TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:36.306 [2024-07-20 16:16:04.987231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.306 [2024-07-20 16:16:04.987291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:36.306 [2024-07-20 16:16:04.987304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.306 #50 NEW cov: 11737 ft: 14301 corp: 34/4465b lim: 320 exec/s: 50 rss: 70Mb L: 234/235 MS: 1 CopyPart- 00:07:36.306 [2024-07-20 16:16:05.027190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2c) qid:0 cid:4 nsid:0 cdw10:53535353 cdw11:53535353 00:07:36.306 [2024-07-20 16:16:05.027218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.306 [2024-07-20 16:16:05.027280] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffff 00:07:36.306 [2024-07-20 16:16:05.027294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.306 #51 NEW cov: 11737 ft: 14338 corp: 35/4627b lim: 320 exec/s: 51 rss: 70Mb L: 162/235 MS: 1 ChangeBinInt- 00:07:36.306 [2024-07-20 16:16:05.057263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3e) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.306 [2024-07-20 16:16:05.057289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.306 [2024-07-20 16:16:05.057343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:36.306 [2024-07-20 16:16:05.057357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.306 #52 NEW cov: 11737 ft: 14362 corp: 36/4787b lim: 320 exec/s: 52 rss: 70Mb L: 160/235 MS: 1 InsertRepeatedBytes- 00:07:36.306 [2024-07-20 16:16:05.097290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3e) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x1f00000000000000 00:07:36.306 [2024-07-20 16:16:05.097315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.565 #53 NEW cov: 11737 ft: 14370 corp: 37/4867b lim: 320 exec/s: 53 rss: 70Mb L: 80/235 MS: 1 InsertByte- 00:07:36.565 [2024-07-20 16:16:05.137572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2c) qid:0 cid:4 nsid:0 cdw10:003effff cdw11:00000000 00:07:36.565 [2024-07-20 16:16:05.137598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.565 [2024-07-20 16:16:05.137656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:36.565 [2024-07-20 16:16:05.137670] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.565 [2024-07-20 16:16:05.137747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffff83ffff 00:07:36.565 [2024-07-20 16:16:05.137761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.565 #54 NEW cov: 11737 ft: 14406 corp: 38/5102b lim: 320 exec/s: 54 rss: 70Mb L: 235/235 MS: 1 ShuffleBytes- 00:07:36.565 [2024-07-20 16:16:05.177497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (23) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.565 [2024-07-20 16:16:05.177525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.565 #55 NEW cov: 11737 ft: 14410 corp: 39/5190b lim: 320 exec/s: 55 rss: 70Mb L: 88/235 MS: 1 ChangeBit- 00:07:36.565 [2024-07-20 16:16:05.207804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2c) qid:0 cid:4 nsid:0 cdw10:003effff cdw11:00000000 00:07:36.565 [2024-07-20 16:16:05.207829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.565 [2024-07-20 16:16:05.207888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:36.565 [2024-07-20 16:16:05.207902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.565 [2024-07-20 16:16:05.207965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffff83ffff 00:07:36.565 [2024-07-20 16:16:05.207979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.565 #56 NEW cov: 11737 ft: 14435 corp: 40/5425b lim: 320 exec/s: 56 rss: 70Mb L: 235/235 MS: 1 ChangeByte- 00:07:36.565 [2024-07-20 16:16:05.247685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (23) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.565 [2024-07-20 16:16:05.247710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.565 #57 NEW cov: 11737 ft: 14448 corp: 41/5514b lim: 320 exec/s: 57 rss: 70Mb L: 89/235 MS: 1 InsertByte- 00:07:36.565 [2024-07-20 16:16:05.287884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (3e) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x1f00000000000000 00:07:36.565 [2024-07-20 16:16:05.287912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.565 #58 NEW cov: 11737 ft: 14454 corp: 42/5596b lim: 320 exec/s: 58 rss: 70Mb L: 82/235 MS: 1 CMP- DE: "\000\014"- 00:07:36.565 [2024-07-20 16:16:05.327972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (23) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:0c003e00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.565 [2024-07-20 
16:16:05.327999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.565 #59 NEW cov: 11737 ft: 14459 corp: 43/5702b lim: 320 exec/s: 59 rss: 70Mb L: 106/235 MS: 1 PersAutoDict- DE: "\000\014"- 00:07:36.565 [2024-07-20 16:16:05.368185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (23) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.565 [2024-07-20 16:16:05.368212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.565 [2024-07-20 16:16:05.368270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (30) qid:0 cid:5 nsid:30303030 cdw10:30303030 cdw11:30303030 00:07:36.565 [2024-07-20 16:16:05.368285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.824 #60 NEW cov: 11737 ft: 14470 corp: 44/5864b lim: 320 exec/s: 60 rss: 70Mb L: 162/235 MS: 1 InsertByte- 00:07:36.824 [2024-07-20 16:16:05.408401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2c) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff 00:07:36.824 [2024-07-20 16:16:05.408427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.824 [2024-07-20 16:16:05.408507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:36.824 [2024-07-20 16:16:05.408523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.824 [2024-07-20 16:16:05.408584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:36.824 [2024-07-20 16:16:05.408598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.824 #61 NEW cov: 11737 ft: 14487 corp: 45/6058b lim: 320 exec/s: 61 rss: 70Mb L: 194/235 MS: 1 CopyPart- 00:07:36.824 [2024-07-20 16:16:05.448406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2c) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff 00:07:36.824 [2024-07-20 16:16:05.448432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.824 [2024-07-20 16:16:05.448499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:25ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:36.824 [2024-07-20 16:16:05.448513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.824 [2024-07-20 16:16:05.478505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2c) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff 00:07:36.824 [2024-07-20 16:16:05.478531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.824 [2024-07-20 16:16:05.478591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff 
cdw10:ffffffff cdw11:25ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:36.824 [2024-07-20 16:16:05.478605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:36.824 #63 NEW cov: 11737 ft: 14493 corp: 46/6186b lim: 320 exec/s: 31 rss: 70Mb L: 128/235 MS: 2 ShuffleBytes-ChangeBit-
00:07:36.824 #63 DONE cov: 11737 ft: 14493 corp: 46/6186b lim: 320 exec/s: 31 rss: 70Mb
00:07:36.824 ###### Recommended dictionary. ######
00:07:36.824 ">\000\000\000\000\000\000\000" # Uses: 6
00:07:36.824 "\000\014" # Uses: 1
00:07:36.824 ###### End of recommended dictionary. ######
00:07:36.824 Done 63 runs in 2 second(s)
00:07:36.824 16:16:05 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_0.conf
16:16:05 -- ../common.sh@72 -- # (( i++ ))
16:16:05 -- ../common.sh@72 -- # (( i < fuzz_num ))
16:16:05 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1
16:16:05 -- nvmf/run.sh@23 -- # local fuzzer_type=1
16:16:05 -- nvmf/run.sh@24 -- # local timen=1
16:16:05 -- nvmf/run.sh@25 -- # local core=0x1
16:16:05 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1
16:16:05 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf
16:16:05 -- nvmf/run.sh@29 -- # printf %02d 1
16:16:05 -- nvmf/run.sh@29 -- # port=4401
16:16:05 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1
16:16:05 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401'
16:16:05 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
16:16:05 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 -r /var/tmp/spdk1.sock
00:07:37.083 [2024-07-20 16:16:05.652574] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization...
[2024-07-20 16:16:05.652646] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2269884 ]
EAL: No free 2048 kB hugepages reported on node 1
[2024-07-20 16:16:05.836069] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
[2024-07-20 16:16:05.856000] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
[2024-07-20 16:16:05.856141] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
[2024-07-20 16:16:05.907425] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
[2024-07-20 16:16:05.923762] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 ***
INFO: Running with entropic power schedule (0xFF, 100).
00:07:37.342 INFO: Seed: 2836789867
00:07:37.342 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37),
00:07:37.342 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8),
00:07:37.342 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1
00:07:37.342 INFO: A corpus is not provided, starting from an empty corpus
00:07:37.342 #2 INITED exec/s: 0 rss: 59Mb
00:07:37.342 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:37.342 This may also happen if the target rejected all inputs we tried so far
00:07:37.342 [2024-07-20 16:16:05.968688] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (39748) > buf size (4096)
00:07:37.342 [2024-07-20 16:16:05.968888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:26d000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:37.342 [2024-07-20 16:16:05.968918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:37.622 NEW_FUNC[1/671]: 0x492140 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67
00:07:37.622 NEW_FUNC[2/671]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:07:37.622 #6 NEW cov: 11568 ft: 11569 corp: 2/12b lim: 30 exec/s: 0 rss: 64Mb L: 11/11 MS: 4 CopyPart-ShuffleBytes-ChangeByte-InsertRepeatedBytes-
00:07:37.622 [2024-07-20 16:16:06.320164] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (39748) > buf size (4096)
00:07:37.622 [2024-07-20 16:16:06.320348] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (213828) > buf size (4096)
00:07:37.622 [2024-07-20 16:16:06.320717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:26d000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:37.622 [2024-07-20 16:16:06.320768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:37.622 [2024-07-20 16:16:06.320910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d0d000d0 cdw11:00000000 SGL TRANSPORT
DATA BLOCK TRANSPORT 0x0 00:07:37.622 [2024-07-20 16:16:06.371010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.622 #8 NEW cov: 11689 ft: 12966 corp: 4/36b lim: 30 exec/s: 0 rss: 65Mb L: 12/12 MS: 1 ShuffleBytes- 00:07:37.622 [2024-07-20 16:16:06.410412] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xd026 00:07:37.622 [2024-07-20 16:16:06.410583] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (213828) > buf size (4096) 00:07:37.623 [2024-07-20 16:16:06.410912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:26d000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.623 [2024-07-20 16:16:06.410941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.623 [2024-07-20 16:16:06.411052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d0d000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.623 [2024-07-20 16:16:06.411071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.881 #9 NEW cov: 11780 ft: 13251 corp: 5/49b lim: 30 exec/s: 0 rss: 65Mb L: 13/13 MS: 1 InsertByte- 00:07:37.881 [2024-07-20 16:16:06.450530] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (12292) > buf size (4096) 00:07:37.881 [2024-07-20 16:16:06.450684] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (16592) > len (4) 00:07:37.881 [2024-07-20 16:16:06.451004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0c000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.881 [2024-07-20 16:16:06.451034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.881 [2024-07-20 16:16:06.451161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:000000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.881 [2024-07-20 16:16:06.451180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.881 #10 NEW cov: 11793 ft: 13327 corp: 6/61b lim: 30 exec/s: 0 rss: 65Mb L: 12/13 MS: 1 ChangeBinInt- 00:07:37.881 [2024-07-20 16:16:06.490677] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (39748) > buf size (4096) 00:07:37.881 [2024-07-20 16:16:06.490836] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (39748) > buf size (4096) 00:07:37.881 [2024-07-20 16:16:06.491213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:26d000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.881 [2024-07-20 16:16:06.491241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.881 [2024-07-20 16:16:06.491366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:26d000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.881 [2024-07-20 16:16:06.491386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.881 #11 NEW cov: 
11793 ft: 13395 corp: 7/75b lim: 30 exec/s: 0 rss: 65Mb L: 14/14 MS: 1 InsertByte- 00:07:37.881 [2024-07-20 16:16:06.531025] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (38916) > buf size (4096) 00:07:37.881 [2024-07-20 16:16:06.531341] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (53456) > len (4) 00:07:37.881 [2024-07-20 16:16:06.531508] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (213828) > buf size (4096) 00:07:37.881 [2024-07-20 16:16:06.531846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:26000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.881 [2024-07-20 16:16:06.531874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.881 [2024-07-20 16:16:06.531994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.881 [2024-07-20 16:16:06.532010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.881 [2024-07-20 16:16:06.532136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.881 [2024-07-20 16:16:06.532154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.881 [2024-07-20 16:16:06.532281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d0d000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.881 [2024-07-20 16:16:06.532299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.881 #12 NEW cov: 11803 ft: 14037 corp: 8/101b lim: 30 exec/s: 0 rss: 65Mb L: 26/26 MS: 1 InsertRepeatedBytes- 00:07:37.881 [2024-07-20 16:16:06.570842] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (39748) > buf size (4096) 00:07:37.881 [2024-07-20 16:16:06.571007] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (213828) > buf size (4096) 00:07:37.881 [2024-07-20 16:16:06.571339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:26d000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.881 [2024-07-20 16:16:06.571369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.881 [2024-07-20 16:16:06.571497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d0d000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.881 [2024-07-20 16:16:06.571516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.881 #13 NEW cov: 11803 ft: 14090 corp: 9/113b lim: 30 exec/s: 0 rss: 65Mb L: 12/26 MS: 1 ChangeBit- 00:07:37.881 [2024-07-20 16:16:06.610994] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (39748) > buf size (4096) 00:07:37.881 [2024-07-20 16:16:06.611329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:26d000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.881 
[2024-07-20 16:16:06.611359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.882 #14 NEW cov: 11803 ft: 14112 corp: 10/124b lim: 30 exec/s: 0 rss: 65Mb L: 11/26 MS: 1 CopyPart- 00:07:37.882 [2024-07-20 16:16:06.651167] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (39748) > buf size (4096) 00:07:37.882 [2024-07-20 16:16:06.651521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:26d000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.882 [2024-07-20 16:16:06.651551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.882 #15 NEW cov: 11803 ft: 14131 corp: 11/135b lim: 30 exec/s: 0 rss: 65Mb L: 11/26 MS: 1 ChangeBit- 00:07:38.140 [2024-07-20 16:16:06.691350] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (39748) > buf size (4096) 00:07:38.140 [2024-07-20 16:16:06.691529] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (213828) > buf size (4096) 00:07:38.140 [2024-07-20 16:16:06.691889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:26d000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.140 [2024-07-20 16:16:06.691916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.141 [2024-07-20 16:16:06.692038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d0d000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.141 [2024-07-20 16:16:06.692056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.141 #16 NEW cov: 11803 ft: 14166 corp: 12/147b lim: 30 exec/s: 0 rss: 65Mb L: 12/26 MS: 1 CopyPart- 00:07:38.141 [2024-07-20 16:16:06.731438] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (39748) > buf size (4096) 00:07:38.141 [2024-07-20 16:16:06.731782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:26d000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.141 [2024-07-20 16:16:06.731811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.141 #22 NEW cov: 11803 ft: 14187 corp: 13/158b lim: 30 exec/s: 0 rss: 65Mb L: 11/26 MS: 1 CopyPart- 00:07:38.141 [2024-07-20 16:16:06.771375] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (826180) > buf size (4096) 00:07:38.141 [2024-07-20 16:16:06.771703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:26d083d0 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.141 [2024-07-20 16:16:06.771734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.141 #23 NEW cov: 11803 ft: 14209 corp: 14/169b lim: 30 exec/s: 0 rss: 65Mb L: 11/26 MS: 1 ChangeBinInt- 00:07:38.141 [2024-07-20 16:16:06.811669] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x2f2f 00:07:38.141 [2024-07-20 16:16:06.811822] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (834752) > buf size (4096) 00:07:38.141 [2024-07-20 16:16:06.812142] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:26d000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.141 [2024-07-20 16:16:06.812172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.141 [2024-07-20 16:16:06.812295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:2f2f832f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.141 [2024-07-20 16:16:06.812313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.141 #24 NEW cov: 11803 ft: 14244 corp: 15/181b lim: 30 exec/s: 0 rss: 65Mb L: 12/26 MS: 1 ChangeBinInt- 00:07:38.141 [2024-07-20 16:16:06.851726] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (56132) > buf size (4096) 00:07:38.141 [2024-07-20 16:16:06.851899] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (213828) > buf size (4096) 00:07:38.141 [2024-07-20 16:16:06.852230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:36d000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.141 [2024-07-20 16:16:06.852261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.141 [2024-07-20 16:16:06.852379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d0d000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.141 [2024-07-20 16:16:06.852397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.141 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:38.141 #25 NEW cov: 11826 ft: 14382 corp: 16/193b lim: 30 exec/s: 0 rss: 66Mb L: 12/26 MS: 1 ChangeBit- 00:07:38.141 [2024-07-20 16:16:06.891960] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (14340) > buf size (4096) 00:07:38.141 [2024-07-20 16:16:06.892120] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (16592) > len (4) 00:07:38.141 [2024-07-20 16:16:06.892452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0e000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.141 [2024-07-20 16:16:06.892482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.141 [2024-07-20 16:16:06.892598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:000000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.141 [2024-07-20 16:16:06.892617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.141 #26 NEW cov: 11826 ft: 14421 corp: 17/205b lim: 30 exec/s: 0 rss: 66Mb L: 12/26 MS: 1 ChangeBit- 00:07:38.141 [2024-07-20 16:16:06.942053] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (39748) > buf size (4096) 00:07:38.141 [2024-07-20 16:16:06.942379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:26d000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.141 [2024-07-20 16:16:06.942409] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.400 #27 NEW cov: 11826 ft: 14449 corp: 18/211b lim: 30 exec/s: 27 rss: 66Mb L: 6/26 MS: 1 EraseBytes- 00:07:38.400 [2024-07-20 16:16:06.982181] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (47940) > buf size (4096) 00:07:38.400 [2024-07-20 16:16:06.982510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2ed000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.400 [2024-07-20 16:16:06.982541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.400 #28 NEW cov: 11826 ft: 14486 corp: 19/222b lim: 30 exec/s: 28 rss: 66Mb L: 11/26 MS: 1 ChangeBit- 00:07:38.400 [2024-07-20 16:16:07.022321] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (213828) > buf size (4096) 00:07:38.400 [2024-07-20 16:16:07.022682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d0d000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.400 [2024-07-20 16:16:07.022713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.400 #29 NEW cov: 11826 ft: 14505 corp: 20/233b lim: 30 exec/s: 29 rss: 66Mb L: 11/26 MS: 1 CopyPart- 00:07:38.400 [2024-07-20 16:16:07.062388] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (39748) > buf size (4096) 00:07:38.400 [2024-07-20 16:16:07.062747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:26d000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.400 [2024-07-20 16:16:07.062776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.400 #30 NEW cov: 11826 ft: 14511 corp: 21/244b lim: 30 exec/s: 30 rss: 66Mb L: 11/26 MS: 1 ShuffleBytes- 00:07:38.400 [2024-07-20 16:16:07.102568] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (39748) > buf size (4096) 00:07:38.400 [2024-07-20 16:16:07.102918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:26d000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.400 [2024-07-20 16:16:07.102949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.400 #31 NEW cov: 11826 ft: 14547 corp: 22/255b lim: 30 exec/s: 31 rss: 66Mb L: 11/26 MS: 1 ChangeBinInt- 00:07:38.400 [2024-07-20 16:16:07.142800] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200008a8a 00:07:38.400 [2024-07-20 16:16:07.142975] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (666156) > buf size (4096) 00:07:38.400 [2024-07-20 16:16:07.143125] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (738116) > buf size (4096) 00:07:38.400 [2024-07-20 16:16:07.143452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:268a028a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.400 [2024-07-20 16:16:07.143482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.400 [2024-07-20 16:16:07.143609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:8a8a028a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.400 [2024-07-20 16:16:07.143629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.400 [2024-07-20 16:16:07.143746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d0d002d0 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.400 [2024-07-20 16:16:07.143764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.400 #32 NEW cov: 11826 ft: 14774 corp: 23/278b lim: 30 exec/s: 32 rss: 66Mb L: 23/26 MS: 1 InsertRepeatedBytes- 00:07:38.400 [2024-07-20 16:16:07.182965] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (39748) > buf size (4096) 00:07:38.400 [2024-07-20 16:16:07.183117] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003e3e 00:07:38.400 [2024-07-20 16:16:07.183268] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003e3e 00:07:38.400 [2024-07-20 16:16:07.183621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:26d000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.400 [2024-07-20 16:16:07.183653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.400 [2024-07-20 16:16:07.183777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d0d002d0 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.400 [2024-07-20 16:16:07.183798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.400 [2024-07-20 16:16:07.183919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:3e3e023e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.400 [2024-07-20 16:16:07.183936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.658 #33 NEW cov: 11826 ft: 14811 corp: 24/300b lim: 30 exec/s: 33 rss: 66Mb L: 22/26 MS: 1 InsertRepeatedBytes- 00:07:38.658 [2024-07-20 16:16:07.222470] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (826180) > buf size (4096) 00:07:38.658 [2024-07-20 16:16:07.222804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:26d083d0 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.658 [2024-07-20 16:16:07.222836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.658 #34 NEW cov: 11826 ft: 14828 corp: 25/311b lim: 30 exec/s: 34 rss: 66Mb L: 11/26 MS: 1 ChangeBinInt- 00:07:38.658 [2024-07-20 16:16:07.263312] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (38916) > buf size (4096) 00:07:38.658 [2024-07-20 16:16:07.263485] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (208) > len (4) 00:07:38.658 [2024-07-20 16:16:07.263633] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (53456) > len (4) 00:07:38.658 [2024-07-20 16:16:07.263793] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (213828) > buf size (4096) 00:07:38.658 [2024-07-20 
16:16:07.264164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:26000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.658 [2024-07-20 16:16:07.264194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.658 [2024-07-20 16:16:07.264317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.658 [2024-07-20 16:16:07.264335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.658 [2024-07-20 16:16:07.264460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.658 [2024-07-20 16:16:07.264479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.658 [2024-07-20 16:16:07.264605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d0d000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.658 [2024-07-20 16:16:07.264625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.658 #35 NEW cov: 11826 ft: 14848 corp: 26/337b lim: 30 exec/s: 35 rss: 66Mb L: 26/26 MS: 1 ShuffleBytes- 00:07:38.658 [2024-07-20 16:16:07.313468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.658 [2024-07-20 16:16:07.313497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.658 #36 NEW cov: 11826 ft: 14896 corp: 27/348b lim: 30 exec/s: 36 rss: 66Mb L: 11/26 MS: 1 ChangeBinInt- 00:07:38.658 [2024-07-20 16:16:07.353400] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (564036) > buf size (4096) 00:07:38.659 [2024-07-20 16:16:07.353583] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x3e3e 00:07:38.659 [2024-07-20 16:16:07.353751] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003e3e 00:07:38.659 [2024-07-20 16:16:07.354100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:26d002d0 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.659 [2024-07-20 16:16:07.354132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.659 [2024-07-20 16:16:07.354245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d0d000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.659 [2024-07-20 16:16:07.354264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.659 [2024-07-20 16:16:07.354386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:3e3e023e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.659 [2024-07-20 16:16:07.354404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:07:38.659 #37 NEW cov: 11826 ft: 14901 corp: 28/371b lim: 30 exec/s: 37 rss: 67Mb L: 23/26 MS: 1 InsertByte- 00:07:38.659 [2024-07-20 16:16:07.403504] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (39748) > buf size (4096) 00:07:38.659 [2024-07-20 16:16:07.403687] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (738116) > buf size (4096) 00:07:38.659 [2024-07-20 16:16:07.404047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:26d000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.659 [2024-07-20 16:16:07.404077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.659 [2024-07-20 16:16:07.404194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d0d002d0 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.659 [2024-07-20 16:16:07.404212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.659 #38 NEW cov: 11826 ft: 14912 corp: 29/383b lim: 30 exec/s: 38 rss: 67Mb L: 12/26 MS: 1 CopyPart- 00:07:38.659 [2024-07-20 16:16:07.453580] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (826180) > buf size (4096) 00:07:38.659 [2024-07-20 16:16:07.453930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:26d0832b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.659 [2024-07-20 16:16:07.453961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.917 #39 NEW cov: 11826 ft: 14918 corp: 30/394b lim: 30 exec/s: 39 rss: 67Mb L: 11/26 MS: 1 ChangeByte- 00:07:38.917 [2024-07-20 16:16:07.493737] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (39748) > buf size (4096) 00:07:38.917 [2024-07-20 16:16:07.493912] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200008a8a 00:07:38.917 [2024-07-20 16:16:07.494062] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200008a8a 00:07:38.917 [2024-07-20 16:16:07.494218] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (141868) > buf size (4096) 00:07:38.917 [2024-07-20 16:16:07.494592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:26d000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.917 [2024-07-20 16:16:07.494623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.917 [2024-07-20 16:16:07.494736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d08a028a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.917 [2024-07-20 16:16:07.494756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.917 [2024-07-20 16:16:07.494882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:8a8a028a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.917 [2024-07-20 16:16:07.494899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.917 [2024-07-20 16:16:07.495021] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:8a8a008a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.917 [2024-07-20 16:16:07.495042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.917 #40 NEW cov: 11826 ft: 14925 corp: 31/420b lim: 30 exec/s: 40 rss: 67Mb L: 26/26 MS: 1 InsertRepeatedBytes- 00:07:38.917 [2024-07-20 16:16:07.533979] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (38964) > buf size (4096) 00:07:38.917 [2024-07-20 16:16:07.534163] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (787268) > buf size (4096) 00:07:38.917 [2024-07-20 16:16:07.534322] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (16384) > len (4) 00:07:38.917 [2024-07-20 16:16:07.534683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:260c0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.917 [2024-07-20 16:16:07.534715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.917 [2024-07-20 16:16:07.534834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00d083d0 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.917 [2024-07-20 16:16:07.534853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.917 [2024-07-20 16:16:07.534976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:000000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.917 [2024-07-20 16:16:07.534993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.917 #41 NEW cov: 11826 ft: 14948 corp: 32/439b lim: 30 exec/s: 41 rss: 67Mb L: 19/26 MS: 1 CrossOver- 00:07:38.917 [2024-07-20 16:16:07.574076] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (12292) > buf size (4096) 00:07:38.917 [2024-07-20 16:16:07.574246] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (53456) > len (4) 00:07:38.917 [2024-07-20 16:16:07.574607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0c000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.917 [2024-07-20 16:16:07.574637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.917 [2024-07-20 16:16:07.574775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:000000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.917 [2024-07-20 16:16:07.574795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.917 #42 NEW cov: 11826 ft: 14961 corp: 33/456b lim: 30 exec/s: 42 rss: 67Mb L: 17/26 MS: 1 CrossOver- 00:07:38.917 [2024-07-20 16:16:07.614209] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (826180) > buf size (4096) 00:07:38.917 [2024-07-20 16:16:07.614383] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (9168) > len (816) 00:07:38.917 [2024-07-20 16:16:07.614737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) 
qid:0 cid:4 nsid:0 cdw10:26d083d0 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.917 [2024-07-20 16:16:07.614768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.917 [2024-07-20 16:16:07.614885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00cb00d4 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.917 [2024-07-20 16:16:07.614903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.917 #43 NEW cov: 11826 ft: 14967 corp: 34/468b lim: 30 exec/s: 43 rss: 67Mb L: 12/26 MS: 1 InsertByte- 00:07:38.917 [2024-07-20 16:16:07.654165] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (47940) > buf size (4096) 00:07:38.917 [2024-07-20 16:16:07.654499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2ed000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.918 [2024-07-20 16:16:07.654529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.918 #44 NEW cov: 11826 ft: 14981 corp: 35/477b lim: 30 exec/s: 44 rss: 67Mb L: 9/26 MS: 1 EraseBytes- 00:07:38.918 [2024-07-20 16:16:07.694376] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (39748) > buf size (4096) 00:07:38.918 [2024-07-20 16:16:07.694556] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (246596) > buf size (4096) 00:07:38.918 [2024-07-20 16:16:07.694908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:26d000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.918 [2024-07-20 16:16:07.694935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.918 [2024-07-20 16:16:07.695066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:f0d000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.918 [2024-07-20 16:16:07.695082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.918 #45 NEW cov: 11826 ft: 14988 corp: 36/489b lim: 30 exec/s: 45 rss: 67Mb L: 12/26 MS: 1 ChangeBit- 00:07:39.177 [2024-07-20 16:16:07.734551] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (563548) > buf size (4096) 00:07:39.177 [2024-07-20 16:16:07.734737] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (11268) > buf size (4096) 00:07:39.177 [2024-07-20 16:16:07.735062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:26560256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.177 [2024-07-20 16:16:07.735092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.177 [2024-07-20 16:16:07.735220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0b000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.177 [2024-07-20 16:16:07.735238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.177 #46 NEW cov: 11826 ft: 14992 corp: 37/504b lim: 30 exec/s: 46 
rss: 67Mb L: 15/26 MS: 1 InsertRepeatedBytes- 00:07:39.177 [2024-07-20 16:16:07.774594] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300002f2f 00:07:39.177 [2024-07-20 16:16:07.774934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:26d08330 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.177 [2024-07-20 16:16:07.774963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.177 #47 NEW cov: 11826 ft: 15003 corp: 38/515b lim: 30 exec/s: 47 rss: 67Mb L: 11/26 MS: 1 ChangeBinInt- 00:07:39.177 [2024-07-20 16:16:07.804908] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (564036) > buf size (4096) 00:07:39.177 [2024-07-20 16:16:07.805065] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x3e3e 00:07:39.177 [2024-07-20 16:16:07.805211] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003e3e 00:07:39.177 [2024-07-20 16:16:07.805599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:26d002d0 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.177 [2024-07-20 16:16:07.805631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.177 [2024-07-20 16:16:07.805749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d0d000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.177 [2024-07-20 16:16:07.805766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.177 [2024-07-20 16:16:07.805887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:3e3e023e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.177 [2024-07-20 16:16:07.805904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.177 #48 NEW cov: 11826 ft: 15011 corp: 39/538b lim: 30 exec/s: 48 rss: 67Mb L: 23/26 MS: 1 ChangeByte- 00:07:39.177 [2024-07-20 16:16:07.844904] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (39748) > buf size (4096) 00:07:39.177 [2024-07-20 16:16:07.845046] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (45892) > buf size (4096) 00:07:39.177 [2024-07-20 16:16:07.845363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:26d000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.177 [2024-07-20 16:16:07.845392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.177 [2024-07-20 16:16:07.845512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:2cd000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.177 [2024-07-20 16:16:07.845529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.177 #49 NEW cov: 11826 ft: 15033 corp: 40/550b lim: 30 exec/s: 49 rss: 67Mb L: 12/26 MS: 1 InsertByte- 00:07:39.177 [2024-07-20 16:16:07.885030] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200008a8a 00:07:39.177 [2024-07-20 16:16:07.885208] 
ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (666156) > buf size (4096) 00:07:39.177 [2024-07-20 16:16:07.885356] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (738116) > buf size (4096) 00:07:39.177 [2024-07-20 16:16:07.885730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:8b8a028a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.177 [2024-07-20 16:16:07.885760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.177 [2024-07-20 16:16:07.885880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:8a8a028a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.177 [2024-07-20 16:16:07.885898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.177 [2024-07-20 16:16:07.886017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d0d002d0 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.177 [2024-07-20 16:16:07.886035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.177 #50 NEW cov: 11826 ft: 15042 corp: 41/573b lim: 30 exec/s: 50 rss: 67Mb L: 23/26 MS: 1 ChangeByte- 00:07:39.177 [2024-07-20 16:16:07.925153] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (12292) > buf size (4096) 00:07:39.177 [2024-07-20 16:16:07.925316] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (53456) > len (4) 00:07:39.177 [2024-07-20 16:16:07.925641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0c000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.177 [2024-07-20 16:16:07.925671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.177 [2024-07-20 16:16:07.925797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:000000d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.177 [2024-07-20 16:16:07.925815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.177 #51 NEW cov: 11826 ft: 15043 corp: 42/590b lim: 30 exec/s: 51 rss: 67Mb L: 17/26 MS: 1 CrossOver- 00:07:39.177 [2024-07-20 16:16:07.965296] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (38964) > buf size (4096) 00:07:39.177 [2024-07-20 16:16:07.965467] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (53312) > len (4) 00:07:39.177 [2024-07-20 16:16:07.965803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:260c0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.177 [2024-07-20 16:16:07.965833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.177 [2024-07-20 16:16:07.965955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.177 [2024-07-20 16:16:07.965972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 
p:0 m:0 dnr:0 00:07:39.436 #52 NEW cov: 11826 ft: 15047 corp: 43/604b lim: 30 exec/s: 26 rss: 67Mb L: 14/26 MS: 1 EraseBytes- 00:07:39.437 #52 DONE cov: 11826 ft: 15047 corp: 43/604b lim: 30 exec/s: 26 rss: 67Mb 00:07:39.437 Done 52 runs in 2 second(s) 00:07:39.437 16:16:08 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_1.conf 00:07:39.437 16:16:08 -- ../common.sh@72 -- # (( i++ )) 00:07:39.437 16:16:08 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:39.437 16:16:08 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:07:39.437 16:16:08 -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:07:39.437 16:16:08 -- nvmf/run.sh@24 -- # local timen=1 00:07:39.437 16:16:08 -- nvmf/run.sh@25 -- # local core=0x1 00:07:39.437 16:16:08 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:39.437 16:16:08 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:07:39.437 16:16:08 -- nvmf/run.sh@29 -- # printf %02d 2 00:07:39.437 16:16:08 -- nvmf/run.sh@29 -- # port=4402 00:07:39.437 16:16:08 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:39.437 16:16:08 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:07:39.437 16:16:08 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:39.437 16:16:08 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 -r /var/tmp/spdk2.sock 00:07:39.437 [2024-07-20 16:16:08.147305] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:07:39.437 [2024-07-20 16:16:08.147373] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2270276 ] 00:07:39.437 EAL: No free 2048 kB hugepages reported on node 1 00:07:39.696 [2024-07-20 16:16:08.322395] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:39.696 [2024-07-20 16:16:08.341992] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:39.696 [2024-07-20 16:16:08.342132] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:39.696 [2024-07-20 16:16:08.393529] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:39.696 [2024-07-20 16:16:08.409825] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:07:39.696 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:39.696 INFO: Seed: 1028820399 00:07:39.696 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:07:39.696 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:07:39.696 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:39.696 INFO: A corpus is not provided, starting from an empty corpus 00:07:39.696 #2 INITED exec/s: 0 rss: 59Mb 00:07:39.696 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:39.696 This may also happen if the target rejected all inputs we tried so far 00:07:39.696 [2024-07-20 16:16:08.465079] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.696 [2024-07-20 16:16:08.465206] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.696 [2024-07-20 16:16:08.465324] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.696 [2024-07-20 16:16:08.465550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.696 [2024-07-20 16:16:08.465581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.696 [2024-07-20 16:16:08.465637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.696 [2024-07-20 16:16:08.465653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.696 [2024-07-20 16:16:08.465707] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.696 [2024-07-20 16:16:08.465721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.696 [2024-07-20 16:16:08.465766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.696 [2024-07-20 16:16:08.465782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.954 NEW_FUNC[1/670]: 0x494b60 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:07:39.954 NEW_FUNC[2/670]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:39.954 #4 NEW cov: 11519 ft: 11515 corp: 2/31b lim: 35 exec/s: 0 rss: 66Mb L: 30/30 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:07:40.213 [2024-07-20 16:16:08.775815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.213 [2024-07-20 16:16:08.775850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.213 #5 NEW cov: 11633 ft: 12555 corp: 3/44b lim: 35 exec/s: 0 rss: 66Mb L: 13/30 MS: 1 CrossOver- 00:07:40.213 [2024-07-20 16:16:08.815664] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: 
*ERROR*: Identify Namespace for invalid NSID 0 00:07:40.213 [2024-07-20 16:16:08.815880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:000a0000 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.213 [2024-07-20 16:16:08.815909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.213 #10 NEW cov: 11639 ft: 12836 corp: 4/55b lim: 35 exec/s: 0 rss: 66Mb L: 11/30 MS: 5 ShuffleBytes-InsertByte-EraseBytes-CrossOver-InsertRepeatedBytes- 00:07:40.213 [2024-07-20 16:16:08.855942] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.213 [2024-07-20 16:16:08.856063] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.213 [2024-07-20 16:16:08.856178] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.213 [2024-07-20 16:16:08.856386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.213 [2024-07-20 16:16:08.856414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.213 [2024-07-20 16:16:08.856474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.213 [2024-07-20 16:16:08.856490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.213 [2024-07-20 16:16:08.856546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.213 [2024-07-20 16:16:08.856561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.213 [2024-07-20 16:16:08.856615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.213 [2024-07-20 16:16:08.856630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.213 #11 NEW cov: 11724 ft: 13101 corp: 5/85b lim: 35 exec/s: 0 rss: 66Mb L: 30/30 MS: 1 ShuffleBytes- 00:07:40.213 [2024-07-20 16:16:08.895864] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.213 [2024-07-20 16:16:08.896076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:000a0000 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.213 [2024-07-20 16:16:08.896103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.213 #12 NEW cov: 11724 ft: 13185 corp: 6/96b lim: 35 exec/s: 0 rss: 66Mb L: 11/30 MS: 1 CMP- DE: "\005\000\000\000"- 00:07:40.213 [2024-07-20 16:16:08.935985] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.213 [2024-07-20 16:16:08.936202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:000a0000 cdw11:ff00ff06 SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:07:40.213 [2024-07-20 16:16:08.936230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.213 #13 NEW cov: 11724 ft: 13344 corp: 7/107b lim: 35 exec/s: 0 rss: 66Mb L: 11/30 MS: 1 ChangeBinInt- 00:07:40.213 [2024-07-20 16:16:08.976205] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.213 [2024-07-20 16:16:08.976420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.213 [2024-07-20 16:16:08.976449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.213 [2024-07-20 16:16:08.976505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:32000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.213 [2024-07-20 16:16:08.976521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.213 #14 NEW cov: 11724 ft: 13705 corp: 8/121b lim: 35 exec/s: 0 rss: 67Mb L: 14/30 MS: 1 InsertByte- 00:07:40.213 [2024-07-20 16:16:09.016452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.213 [2024-07-20 16:16:09.016479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.472 #15 NEW cov: 11724 ft: 13780 corp: 9/134b lim: 35 exec/s: 0 rss: 67Mb L: 13/30 MS: 1 ChangeBinInt- 00:07:40.472 [2024-07-20 16:16:09.056679] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:df00dfdf SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.472 [2024-07-20 16:16:09.056704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.472 [2024-07-20 16:16:09.056758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:df0000df cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.472 [2024-07-20 16:16:09.056772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.472 #16 NEW cov: 11724 ft: 13817 corp: 10/154b lim: 35 exec/s: 0 rss: 67Mb L: 20/30 MS: 1 InsertRepeatedBytes- 00:07:40.472 [2024-07-20 16:16:09.096462] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.472 [2024-07-20 16:16:09.096696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:000a0000 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.472 [2024-07-20 16:16:09.096723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.472 #17 NEW cov: 11724 ft: 13855 corp: 11/165b lim: 35 exec/s: 0 rss: 67Mb L: 11/30 MS: 1 PersAutoDict- DE: "\005\000\000\000"- 00:07:40.472 [2024-07-20 16:16:09.136743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.472 [2024-07-20 16:16:09.136769] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.472 #18 NEW cov: 11724 ft: 13914 corp: 12/178b lim: 35 exec/s: 0 rss: 67Mb L: 13/30 MS: 1 ChangeBit- 00:07:40.472 [2024-07-20 16:16:09.176861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:cc000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.472 [2024-07-20 16:16:09.176886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.472 #19 NEW cov: 11724 ft: 13943 corp: 13/191b lim: 35 exec/s: 0 rss: 67Mb L: 13/30 MS: 1 ChangeByte- 00:07:40.472 [2024-07-20 16:16:09.216995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.472 [2024-07-20 16:16:09.217020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.472 #20 NEW cov: 11724 ft: 13985 corp: 14/204b lim: 35 exec/s: 0 rss: 67Mb L: 13/30 MS: 1 ChangeBit- 00:07:40.472 [2024-07-20 16:16:09.257098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.472 [2024-07-20 16:16:09.257124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.730 #21 NEW cov: 11724 ft: 14003 corp: 15/217b lim: 35 exec/s: 0 rss: 67Mb L: 13/30 MS: 1 ChangeBit- 00:07:40.730 [2024-07-20 16:16:09.297226] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.730 [2024-07-20 16:16:09.297346] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.730 [2024-07-20 16:16:09.297464] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.730 [2024-07-20 16:16:09.297677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.730 [2024-07-20 16:16:09.297704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.730 [2024-07-20 16:16:09.297756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.730 [2024-07-20 16:16:09.297775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.730 [2024-07-20 16:16:09.297820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.730 [2024-07-20 16:16:09.297834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.730 [2024-07-20 16:16:09.297888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:f800f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.730 [2024-07-20 16:16:09.297903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 
00:07:40.730 #22 NEW cov: 11724 ft: 14013 corp: 16/251b lim: 35 exec/s: 0 rss: 67Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:07:40.730 [2024-07-20 16:16:09.337307] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.730 [2024-07-20 16:16:09.337424] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.730 [2024-07-20 16:16:09.337651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.730 [2024-07-20 16:16:09.337678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.730 [2024-07-20 16:16:09.337735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.730 [2024-07-20 16:16:09.337751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.730 [2024-07-20 16:16:09.337807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.730 [2024-07-20 16:16:09.337823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.730 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:40.730 #23 NEW cov: 11747 ft: 14227 corp: 17/277b lim: 35 exec/s: 0 rss: 67Mb L: 26/34 MS: 1 EraseBytes- 00:07:40.730 [2024-07-20 16:16:09.377601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:df00dfdf SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.730 [2024-07-20 16:16:09.377627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.730 [2024-07-20 16:16:09.377682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:df0000df cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.731 [2024-07-20 16:16:09.377697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.731 #24 NEW cov: 11747 ft: 14256 corp: 18/297b lim: 35 exec/s: 0 rss: 67Mb L: 20/34 MS: 1 ChangeByte- 00:07:40.731 [2024-07-20 16:16:09.417610] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.731 [2024-07-20 16:16:09.417729] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.731 [2024-07-20 16:16:09.418031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.731 [2024-07-20 16:16:09.418057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.731 [2024-07-20 16:16:09.418113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.731 [2024-07-20 16:16:09.418127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.731 [2024-07-20 16:16:09.418184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:df0000df SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.731 [2024-07-20 16:16:09.418198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.731 [2024-07-20 16:16:09.418250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:000000df cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.731 [2024-07-20 16:16:09.418263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.731 #25 NEW cov: 11747 ft: 14270 corp: 19/331b lim: 35 exec/s: 25 rss: 67Mb L: 34/34 MS: 1 CrossOver- 00:07:40.731 [2024-07-20 16:16:09.457697] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:2df200ff cdw11:41005bf6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.731 [2024-07-20 16:16:09.457722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.731 #26 NEW cov: 11747 ft: 14344 corp: 20/342b lim: 35 exec/s: 26 rss: 67Mb L: 11/34 MS: 1 CMP- DE: "\377-\362[\366A\262\236"- 00:07:40.731 [2024-07-20 16:16:09.497834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:0000000d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.731 [2024-07-20 16:16:09.497860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.731 #27 NEW cov: 11747 ft: 14359 corp: 21/355b lim: 35 exec/s: 27 rss: 67Mb L: 13/34 MS: 1 ChangeBinInt- 00:07:40.990 [2024-07-20 16:16:09.537748] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.990 [2024-07-20 16:16:09.537977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00ca0005 cdw11:08000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.990 [2024-07-20 16:16:09.538006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.990 NEW_FUNC[1/1]: 0x10ebda0 in spdk_nvmf_ns_identify_iocs_specific /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:2843 00:07:40.990 #31 NEW cov: 11765 ft: 14392 corp: 22/362b lim: 35 exec/s: 31 rss: 67Mb L: 7/34 MS: 4 ChangeBit-InsertByte-PersAutoDict-InsertByte- DE: "\005\000\000\000"- 00:07:40.990 [2024-07-20 16:16:09.577878] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.990 [2024-07-20 16:16:09.578091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:000a0000 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.990 [2024-07-20 16:16:09.578116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.990 #32 NEW cov: 11765 ft: 14405 corp: 23/374b lim: 35 exec/s: 32 rss: 67Mb L: 12/34 MS: 1 CopyPart- 00:07:40.990 [2024-07-20 16:16:09.618163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00b5000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:40.990 [2024-07-20 16:16:09.618189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.990 #33 NEW cov: 11765 ft: 14461 corp: 24/387b lim: 35 exec/s: 33 rss: 67Mb L: 13/34 MS: 1 ChangeByte- 00:07:40.990 [2024-07-20 16:16:09.658274] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.990 [2024-07-20 16:16:09.658394] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.990 [2024-07-20 16:16:09.658509] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.990 [2024-07-20 16:16:09.658718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.990 [2024-07-20 16:16:09.658747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.990 [2024-07-20 16:16:09.658802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.990 [2024-07-20 16:16:09.658818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.990 [2024-07-20 16:16:09.658870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00002000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.990 [2024-07-20 16:16:09.658885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.990 [2024-07-20 16:16:09.658938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.990 [2024-07-20 16:16:09.658953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.990 #34 NEW cov: 11765 ft: 14515 corp: 25/417b lim: 35 exec/s: 34 rss: 67Mb L: 30/34 MS: 1 ChangeBit- 00:07:40.990 [2024-07-20 16:16:09.698369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:df00000a cdw11:0000dfdf SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.990 [2024-07-20 16:16:09.698394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.990 [2024-07-20 16:16:09.698451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:df0000df cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.990 [2024-07-20 16:16:09.698466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.990 #35 NEW cov: 11765 ft: 14560 corp: 26/437b lim: 35 exec/s: 35 rss: 67Mb L: 20/34 MS: 1 ShuffleBytes- 00:07:40.990 [2024-07-20 16:16:09.738510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:0000cc00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.990 [2024-07-20 16:16:09.738535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.990 #36 NEW cov: 11765 ft: 14576 corp: 
27/448b lim: 35 exec/s: 36 rss: 68Mb L: 11/34 MS: 1 EraseBytes- 00:07:40.990 [2024-07-20 16:16:09.778644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00b5000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.990 [2024-07-20 16:16:09.778670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.249 #37 NEW cov: 11765 ft: 14615 corp: 28/461b lim: 35 exec/s: 37 rss: 68Mb L: 13/34 MS: 1 ChangeByte- 00:07:41.249 [2024-07-20 16:16:09.818751] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.249 [2024-07-20 16:16:09.818875] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.249 [2024-07-20 16:16:09.819084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.249 [2024-07-20 16:16:09.819110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.249 [2024-07-20 16:16:09.819165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.249 [2024-07-20 16:16:09.819180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.249 [2024-07-20 16:16:09.819233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.249 [2024-07-20 16:16:09.819254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.249 #38 NEW cov: 11765 ft: 14634 corp: 29/487b lim: 35 exec/s: 38 rss: 68Mb L: 26/34 MS: 1 ChangeBinInt- 00:07:41.249 [2024-07-20 16:16:09.859002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.249 [2024-07-20 16:16:09.859027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.249 #39 NEW cov: 11765 ft: 14658 corp: 30/500b lim: 35 exec/s: 39 rss: 68Mb L: 13/34 MS: 1 ChangeBinInt- 00:07:41.249 [2024-07-20 16:16:09.899097] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.249 [2024-07-20 16:16:09.899234] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.249 [2024-07-20 16:16:09.899554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.249 [2024-07-20 16:16:09.899580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.249 [2024-07-20 16:16:09.899635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:20000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.250 [2024-07-20 16:16:09.899651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.250 
[2024-07-20 16:16:09.899706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:df0000df SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.250 [2024-07-20 16:16:09.899722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.250 [2024-07-20 16:16:09.899778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:000000df cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.250 [2024-07-20 16:16:09.899791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.250 #40 NEW cov: 11765 ft: 14665 corp: 31/534b lim: 35 exec/s: 40 rss: 68Mb L: 34/34 MS: 1 ChangeBit- 00:07:41.250 [2024-07-20 16:16:09.939251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.250 [2024-07-20 16:16:09.939277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.250 #41 NEW cov: 11765 ft: 14704 corp: 32/542b lim: 35 exec/s: 41 rss: 68Mb L: 8/34 MS: 1 EraseBytes- 00:07:41.250 [2024-07-20 16:16:09.979339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00ff000a cdw11:5b002df2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.250 [2024-07-20 16:16:09.979366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.250 #42 NEW cov: 11765 ft: 14741 corp: 33/555b lim: 35 exec/s: 42 rss: 68Mb L: 13/34 MS: 1 PersAutoDict- DE: "\377-\362[\366A\262\236"- 00:07:41.250 [2024-07-20 16:16:10.019491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00b5000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.250 [2024-07-20 16:16:10.019517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.250 #43 NEW cov: 11765 ft: 14745 corp: 34/568b lim: 35 exec/s: 43 rss: 68Mb L: 13/34 MS: 1 ShuffleBytes- 00:07:41.509 [2024-07-20 16:16:10.059568] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.509 [2024-07-20 16:16:10.059806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:28000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.509 [2024-07-20 16:16:10.059835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.509 [2024-07-20 16:16:10.059893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.509 [2024-07-20 16:16:10.059911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.509 #44 NEW cov: 11765 ft: 14755 corp: 35/582b lim: 35 exec/s: 44 rss: 68Mb L: 14/34 MS: 1 InsertByte- 00:07:41.509 [2024-07-20 16:16:10.099711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff5b00f2 cdw11:41002df6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.509 [2024-07-20 16:16:10.099738] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.509 #45 NEW cov: 11765 ft: 14768 corp: 36/593b lim: 35 exec/s: 45 rss: 68Mb L: 11/34 MS: 1 ShuffleBytes- 00:07:41.509 [2024-07-20 16:16:10.139619] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.509 [2024-07-20 16:16:10.139832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:000a0000 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.509 [2024-07-20 16:16:10.139859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.509 #46 NEW cov: 11765 ft: 14822 corp: 37/605b lim: 35 exec/s: 46 rss: 68Mb L: 12/34 MS: 1 ShuffleBytes- 00:07:41.509 [2024-07-20 16:16:10.179924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff5b00f2 cdw11:f6002dc0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.509 [2024-07-20 16:16:10.179950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.509 #47 NEW cov: 11765 ft: 14824 corp: 38/617b lim: 35 exec/s: 47 rss: 68Mb L: 12/34 MS: 1 InsertByte- 00:07:41.509 [2024-07-20 16:16:10.220063] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.509 [2024-07-20 16:16:10.220177] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.509 [2024-07-20 16:16:10.220374] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.509 [2024-07-20 16:16:10.220589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0027000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.509 [2024-07-20 16:16:10.220615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.509 [2024-07-20 16:16:10.220666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00200000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.509 [2024-07-20 16:16:10.220682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.509 [2024-07-20 16:16:10.220734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:df000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.509 [2024-07-20 16:16:10.220750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.509 [2024-07-20 16:16:10.220800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:df0000df cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.509 [2024-07-20 16:16:10.220813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.509 [2024-07-20 16:16:10.220866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.509 [2024-07-20 16:16:10.220884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:41.509 #48 NEW cov: 11765 ft: 14864 corp: 39/652b lim: 35 exec/s: 48 rss: 69Mb L: 35/35 MS: 1 InsertByte- 00:07:41.509 [2024-07-20 16:16:10.260118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4e1700aa cdw11:7f0054aa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.509 [2024-07-20 16:16:10.260143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.509 #49 NEW cov: 11765 ft: 14869 corp: 40/660b lim: 35 exec/s: 49 rss: 69Mb L: 8/35 MS: 1 CMP- DE: "\252N\027T\252\177\000\000"- 00:07:41.510 [2024-07-20 16:16:10.300260] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.510 [2024-07-20 16:16:10.300377] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.510 [2024-07-20 16:16:10.300492] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.510 [2024-07-20 16:16:10.300707] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.510 [2024-07-20 16:16:10.300734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.510 [2024-07-20 16:16:10.300787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.510 [2024-07-20 16:16:10.300803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.510 [2024-07-20 16:16:10.300856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.510 [2024-07-20 16:16:10.300871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.510 [2024-07-20 16:16:10.300924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ff000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.510 [2024-07-20 16:16:10.300940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.770 #50 NEW cov: 11765 ft: 14879 corp: 41/688b lim: 35 exec/s: 50 rss: 69Mb L: 28/35 MS: 1 CopyPart- 00:07:41.770 [2024-07-20 16:16:10.340399] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.770 [2024-07-20 16:16:10.340616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:2b00000a cdw11:df0000df SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.770 [2024-07-20 16:16:10.340642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.770 [2024-07-20 16:16:10.340697] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:dfdf00df cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.770 [2024-07-20 16:16:10.340711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:07:41.770 [2024-07-20 16:16:10.340764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:32000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.770 [2024-07-20 16:16:10.340779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.770 #51 NEW cov: 11765 ft: 14888 corp: 42/709b lim: 35 exec/s: 51 rss: 69Mb L: 21/35 MS: 1 InsertByte- 00:07:41.770 [2024-07-20 16:16:10.380493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ca000008 cdw11:00000500 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.770 [2024-07-20 16:16:10.380517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.770 #52 NEW cov: 11765 ft: 14896 corp: 43/716b lim: 35 exec/s: 52 rss: 69Mb L: 7/35 MS: 1 ShuffleBytes- 00:07:41.770 [2024-07-20 16:16:10.420606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00ff000a cdw11:5b002df2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.770 [2024-07-20 16:16:10.420631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.770 #53 NEW cov: 11765 ft: 14903 corp: 44/729b lim: 35 exec/s: 53 rss: 69Mb L: 13/35 MS: 1 PersAutoDict- DE: "\377-\362[\366A\262\236"- 00:07:41.770 [2024-07-20 16:16:10.460667] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.770 [2024-07-20 16:16:10.460787] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.770 [2024-07-20 16:16:10.461112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.770 [2024-07-20 16:16:10.461139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.770 [2024-07-20 16:16:10.461194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.770 [2024-07-20 16:16:10.461208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.770 [2024-07-20 16:16:10.461265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.770 [2024-07-20 16:16:10.461279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.770 [2024-07-20 16:16:10.461332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:f8f800f8 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.770 [2024-07-20 16:16:10.461345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.770 #54 NEW cov: 11765 ft: 14904 corp: 45/759b lim: 35 exec/s: 27 rss: 69Mb L: 30/35 MS: 1 EraseBytes- 00:07:41.770 #54 DONE cov: 11765 ft: 14904 corp: 45/759b lim: 35 exec/s: 27 rss: 69Mb 00:07:41.770 ###### Recommended dictionary. 
###### 00:07:41.770 "\005\000\000\000" # Uses: 2 00:07:41.770 "\377-\362[\366A\262\236" # Uses: 2 00:07:41.770 "\252N\027T\252\177\000\000" # Uses: 0 00:07:41.770 ###### End of recommended dictionary. ###### 00:07:41.770 Done 54 runs in 2 second(s) 00:07:42.029 16:16:10 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_2.conf 00:07:42.029 16:16:10 -- ../common.sh@72 -- # (( i++ )) 00:07:42.029 16:16:10 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:42.029 16:16:10 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:07:42.029 16:16:10 -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:07:42.029 16:16:10 -- nvmf/run.sh@24 -- # local timen=1 00:07:42.029 16:16:10 -- nvmf/run.sh@25 -- # local core=0x1 00:07:42.029 16:16:10 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:42.029 16:16:10 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:07:42.029 16:16:10 -- nvmf/run.sh@29 -- # printf %02d 3 00:07:42.029 16:16:10 -- nvmf/run.sh@29 -- # port=4403 00:07:42.029 16:16:10 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:42.029 16:16:10 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:07:42.029 16:16:10 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:42.029 16:16:10 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 -r /var/tmp/spdk3.sock 00:07:42.029 [2024-07-20 16:16:10.643525] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:07:42.029 [2024-07-20 16:16:10.643596] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2270823 ] 00:07:42.029 EAL: No free 2048 kB hugepages reported on node 1 00:07:42.029 [2024-07-20 16:16:10.817755] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:42.288 [2024-07-20 16:16:10.837490] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:42.288 [2024-07-20 16:16:10.837631] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:42.288 [2024-07-20 16:16:10.888996] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:42.288 [2024-07-20 16:16:10.905332] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:07:42.288 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:42.288 INFO: Seed: 3522818832 00:07:42.288 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:07:42.288 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:07:42.288 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:42.288 INFO: A corpus is not provided, starting from an empty corpus 00:07:42.288 #2 INITED exec/s: 0 rss: 59Mb 00:07:42.288 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:42.288 This may also happen if the target rejected all inputs we tried so far 00:07:42.288 [2024-07-20 16:16:10.950706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:42.288 [2024-07-20 16:16:10.950737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.546 NEW_FUNC[1/679]: 0x496830 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:07:42.546 NEW_FUNC[2/679]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:42.546 #10 NEW cov: 11739 ft: 11740 corp: 2/10b lim: 20 exec/s: 0 rss: 66Mb L: 9/9 MS: 3 CrossOver-CrossOver-CMP- DE: "\000\000\000\000\000\000\000\000"- 00:07:42.546 #19 NEW cov: 11862 ft: 12559 corp: 3/25b lim: 20 exec/s: 0 rss: 66Mb L: 15/15 MS: 4 ShuffleBytes-ChangeBit-CrossOver-InsertRepeatedBytes- 00:07:42.546 [2024-07-20 16:16:11.301539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:42.546 [2024-07-20 16:16:11.301574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.546 #20 NEW cov: 11871 ft: 12908 corp: 4/40b lim: 20 exec/s: 0 rss: 66Mb L: 15/15 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:07:42.805 #21 NEW cov: 11956 ft: 13145 corp: 5/55b lim: 20 exec/s: 0 rss: 66Mb L: 15/15 MS: 1 CopyPart- 00:07:42.805 [2024-07-20 16:16:11.381789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:42.805 [2024-07-20 16:16:11.381817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.805 #22 NEW cov: 11956 ft: 13231 corp: 6/70b lim: 20 exec/s: 0 rss: 66Mb L: 15/15 MS: 1 ChangeBinInt- 00:07:42.805 [2024-07-20 16:16:11.422044] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:42.805 [2024-07-20 16:16:11.422071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.805 #23 NEW cov: 11973 ft: 13536 corp: 7/87b lim: 20 exec/s: 0 rss: 66Mb L: 17/17 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:07:42.805 #27 NEW cov: 11973 ft: 13982 corp: 8/91b lim: 20 exec/s: 0 rss: 66Mb L: 4/17 MS: 4 CrossOver-InsertByte-CrossOver-InsertByte- 00:07:42.805 [2024-07-20 16:16:11.502114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:42.805 [2024-07-20 16:16:11.502142] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.805 #28 NEW cov: 11973 ft: 14009 corp: 9/106b lim: 20 exec/s: 0 rss: 67Mb L: 15/17 MS: 1 ShuffleBytes- 00:07:42.805 [2024-07-20 16:16:11.542195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:42.805 [2024-07-20 16:16:11.542223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.805 #29 NEW cov: 11973 ft: 14094 corp: 10/121b lim: 20 exec/s: 0 rss: 67Mb L: 15/17 MS: 1 ChangeByte- 00:07:42.805 [2024-07-20 16:16:11.582438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:42.805 [2024-07-20 16:16:11.582469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.064 #30 NEW cov: 11973 ft: 14143 corp: 11/140b lim: 20 exec/s: 0 rss: 67Mb L: 19/19 MS: 1 InsertRepeatedBytes- 00:07:43.064 [2024-07-20 16:16:11.622573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:43.064 [2024-07-20 16:16:11.622599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.064 #31 NEW cov: 11973 ft: 14217 corp: 12/156b lim: 20 exec/s: 0 rss: 67Mb L: 16/19 MS: 1 InsertByte- 00:07:43.064 #34 NEW cov: 11973 ft: 14293 corp: 13/160b lim: 20 exec/s: 0 rss: 67Mb L: 4/19 MS: 3 EraseBytes-ChangeBit-InsertByte- 00:07:43.064 [2024-07-20 16:16:11.702670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:43.064 [2024-07-20 16:16:11.702697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.064 #35 NEW cov: 11973 ft: 14311 corp: 14/175b lim: 20 exec/s: 0 rss: 67Mb L: 15/19 MS: 1 ChangeBinInt- 00:07:43.064 [2024-07-20 16:16:11.742945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:43.064 [2024-07-20 16:16:11.742973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.064 #36 NEW cov: 11973 ft: 14333 corp: 15/192b lim: 20 exec/s: 0 rss: 67Mb L: 17/19 MS: 1 ShuffleBytes- 00:07:43.064 [2024-07-20 16:16:11.783085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:43.064 [2024-07-20 16:16:11.783112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.064 #37 NEW cov: 11973 ft: 14381 corp: 16/209b lim: 20 exec/s: 0 rss: 67Mb L: 17/19 MS: 1 ChangeByte- 00:07:43.064 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:43.064 #38 NEW cov: 11996 ft: 14414 corp: 17/219b lim: 20 exec/s: 0 rss: 67Mb L: 10/19 MS: 1 EraseBytes- 00:07:43.323 #39 NEW cov: 11996 ft: 14425 corp: 18/229b lim: 20 exec/s: 0 rss: 67Mb L: 10/19 MS: 1 EraseBytes- 00:07:43.323 #40 NEW cov: 11996 ft: 14449 corp: 19/242b lim: 20 exec/s: 0 rss: 67Mb L: 13/19 
MS: 1 InsertRepeatedBytes- 00:07:43.323 [2024-07-20 16:16:11.943235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:43.323 [2024-07-20 16:16:11.943263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.323 #41 NEW cov: 11996 ft: 14491 corp: 20/251b lim: 20 exec/s: 41 rss: 67Mb L: 9/19 MS: 1 ChangeBit- 00:07:43.323 [2024-07-20 16:16:11.983485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:43.323 [2024-07-20 16:16:11.983516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.323 #42 NEW cov: 11996 ft: 14513 corp: 21/266b lim: 20 exec/s: 42 rss: 67Mb L: 15/19 MS: 1 ShuffleBytes- 00:07:43.323 #43 NEW cov: 11996 ft: 14525 corp: 22/276b lim: 20 exec/s: 43 rss: 67Mb L: 10/19 MS: 1 ShuffleBytes- 00:07:43.323 [2024-07-20 16:16:12.063849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:43.323 [2024-07-20 16:16:12.063876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.323 #44 NEW cov: 11996 ft: 14612 corp: 23/294b lim: 20 exec/s: 44 rss: 68Mb L: 18/19 MS: 1 InsertByte- 00:07:43.582 #45 NEW cov: 11996 ft: 14618 corp: 24/307b lim: 20 exec/s: 45 rss: 68Mb L: 13/19 MS: 1 ShuffleBytes- 00:07:43.582 [2024-07-20 16:16:12.144046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:43.582 [2024-07-20 16:16:12.144072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.582 #46 NEW cov: 11996 ft: 14635 corp: 25/323b lim: 20 exec/s: 46 rss: 68Mb L: 16/19 MS: 1 ChangeBinInt- 00:07:43.582 #47 NEW cov: 11996 ft: 14646 corp: 26/338b lim: 20 exec/s: 47 rss: 68Mb L: 15/19 MS: 1 ChangeBinInt- 00:07:43.582 [2024-07-20 16:16:12.224253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:43.582 [2024-07-20 16:16:12.224279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.582 [2024-07-20 16:16:12.224393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:1 nsid:0 cdw10:00000000 cdw11:00000000 00:07:43.582 [2024-07-20 16:16:12.224424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:1 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.582 #48 NEW cov: 11997 ft: 14908 corp: 27/355b lim: 20 exec/s: 48 rss: 68Mb L: 17/19 MS: 1 CrossOver- 00:07:43.582 [2024-07-20 16:16:12.264360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:43.582 [2024-07-20 16:16:12.264385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.582 [2024-07-20 16:16:12.264500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:1 nsid:0 cdw10:00000000 cdw11:00000000 00:07:43.582 [2024-07-20 
16:16:12.264515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:1 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.582 #49 NEW cov: 11997 ft: 14918 corp: 28/371b lim: 20 exec/s: 49 rss: 68Mb L: 16/19 MS: 1 InsertByte- 00:07:43.582 [2024-07-20 16:16:12.304211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:43.582 [2024-07-20 16:16:12.304237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.582 #50 NEW cov: 11997 ft: 14954 corp: 29/380b lim: 20 exec/s: 50 rss: 68Mb L: 9/19 MS: 1 ChangeBit- 00:07:43.582 [2024-07-20 16:16:12.344654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:43.582 [2024-07-20 16:16:12.344681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.582 #51 NEW cov: 11997 ft: 14962 corp: 30/397b lim: 20 exec/s: 51 rss: 68Mb L: 17/19 MS: 1 CMP- DE: "\344\022T\242]\362.\000"- 00:07:43.841 #52 NEW cov: 11997 ft: 14999 corp: 31/409b lim: 20 exec/s: 52 rss: 68Mb L: 12/19 MS: 1 EraseBytes- 00:07:43.841 #53 NEW cov: 11997 ft: 15030 corp: 32/428b lim: 20 exec/s: 53 rss: 68Mb L: 19/19 MS: 1 InsertRepeatedBytes- 00:07:43.841 #54 NEW cov: 11997 ft: 15033 corp: 33/445b lim: 20 exec/s: 54 rss: 69Mb L: 17/19 MS: 1 InsertRepeatedBytes- 00:07:43.841 [2024-07-20 16:16:12.505071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:43.841 [2024-07-20 16:16:12.505097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.841 #55 NEW cov: 11997 ft: 15054 corp: 34/462b lim: 20 exec/s: 55 rss: 69Mb L: 17/19 MS: 1 ChangeByte- 00:07:43.841 #56 NEW cov: 11997 ft: 15067 corp: 35/478b lim: 20 exec/s: 56 rss: 69Mb L: 16/19 MS: 1 InsertByte- 00:07:43.841 [2024-07-20 16:16:12.585462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:43.841 [2024-07-20 16:16:12.585488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.841 #57 NEW cov: 11997 ft: 15125 corp: 36/498b lim: 20 exec/s: 57 rss: 69Mb L: 20/20 MS: 1 CrossOver- 00:07:43.841 [2024-07-20 16:16:12.625484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:43.841 [2024-07-20 16:16:12.625511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.100 #58 NEW cov: 11997 ft: 15155 corp: 37/515b lim: 20 exec/s: 58 rss: 69Mb L: 17/20 MS: 1 ShuffleBytes- 00:07:44.100 [2024-07-20 16:16:12.665701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:44.100 [2024-07-20 16:16:12.665726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.100 #59 NEW cov: 11997 ft: 15214 corp: 38/535b lim: 20 exec/s: 59 rss: 69Mb L: 20/20 MS: 1 PersAutoDict- DE: "\344\022T\242]\362.\000"- 00:07:44.100 [2024-07-20 
16:16:12.705487] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:44.100 [2024-07-20 16:16:12.705514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.100 #60 NEW cov: 11997 ft: 15224 corp: 39/550b lim: 20 exec/s: 60 rss: 70Mb L: 15/20 MS: 1 CrossOver- 00:07:44.100 [2024-07-20 16:16:12.745611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:44.100 [2024-07-20 16:16:12.745637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.100 #61 NEW cov: 11997 ft: 15229 corp: 40/563b lim: 20 exec/s: 61 rss: 70Mb L: 13/20 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:07:44.100 #69 NEW cov: 11997 ft: 15235 corp: 41/569b lim: 20 exec/s: 69 rss: 70Mb L: 6/20 MS: 3 CopyPart-ChangeBinInt-CMP- DE: "\377\377\377\377"- 00:07:44.100 #70 NEW cov: 11997 ft: 15256 corp: 42/575b lim: 20 exec/s: 70 rss: 70Mb L: 6/20 MS: 1 ChangeBit- 00:07:44.100 [2024-07-20 16:16:12.856102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:44.100 [2024-07-20 16:16:12.856129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.100 #71 NEW cov: 11998 ft: 15292 corp: 43/594b lim: 20 exec/s: 71 rss: 70Mb L: 19/20 MS: 1 CopyPart- 00:07:44.100 [2024-07-20 16:16:12.896214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:44.100 [2024-07-20 16:16:12.896240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.359 #72 NEW cov: 11998 ft: 15297 corp: 44/612b lim: 20 exec/s: 72 rss: 70Mb L: 18/20 MS: 1 InsertByte- 00:07:44.359 #73 NEW cov: 11998 ft: 15310 corp: 45/622b lim: 20 exec/s: 36 rss: 70Mb L: 10/20 MS: 1 ChangeByte- 00:07:44.359 #73 DONE cov: 11998 ft: 15310 corp: 45/622b lim: 20 exec/s: 36 rss: 70Mb 00:07:44.359 ###### Recommended dictionary. ###### 00:07:44.359 "\000\000\000\000\000\000\000\000" # Uses: 3 00:07:44.359 "\344\022T\242]\362.\000" # Uses: 1 00:07:44.359 "\377\377\377\377" # Uses: 0 00:07:44.359 ###### End of recommended dictionary. 
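The "Recommended dictionary" block that closes each run lists byte strings (mostly harvested by CMP-style mutations) that kept proving useful, together with a use count. Any libFuzzer binary, llvm_nvme_fuzz included, can be re-seeded with them through libFuzzer's -dict= option; a minimal sketch using the run-3 entries printed just above, hand-converted from libFuzzer's octal escapes to the \xNN form dictionary files expect (whether run.sh forwards -dict to the target is an assumption, so the binary is shown invoked directly):

    cat > nvmf_3.dict <<'EOF'
    # transcribed from run 3's recommended dictionary above
    k1="\x00\x00\x00\x00\x00\x00\x00\x00"
    k2="\xe4\x12T\xa2]\xf2.\x00"
    k3="\xff\xff\xff\xff"
    EOF
    # llvm_nvme_fuzz is a libFuzzer target, so in principle:
    #   ./llvm_nvme_fuzz -dict=nvmf_3.dict <same arguments as in the traced command>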
###### 00:07:44.359 Done 73 runs in 2 second(s) 00:07:44.359 16:16:13 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_3.conf 00:07:44.359 16:16:13 -- ../common.sh@72 -- # (( i++ )) 00:07:44.359 16:16:13 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:44.359 16:16:13 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:07:44.359 16:16:13 -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:07:44.359 16:16:13 -- nvmf/run.sh@24 -- # local timen=1 00:07:44.359 16:16:13 -- nvmf/run.sh@25 -- # local core=0x1 00:07:44.359 16:16:13 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:44.359 16:16:13 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:07:44.359 16:16:13 -- nvmf/run.sh@29 -- # printf %02d 4 00:07:44.359 16:16:13 -- nvmf/run.sh@29 -- # port=4404 00:07:44.359 16:16:13 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:44.359 16:16:13 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:07:44.359 16:16:13 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:44.359 16:16:13 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 -r /var/tmp/spdk4.sock 00:07:44.359 [2024-07-20 16:16:13.115656] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:07:44.360 [2024-07-20 16:16:13.115738] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2271131 ] 00:07:44.360 EAL: No free 2048 kB hugepages reported on node 1 00:07:44.617 [2024-07-20 16:16:13.297575] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:44.618 [2024-07-20 16:16:13.316779] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:44.618 [2024-07-20 16:16:13.316921] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:44.618 [2024-07-20 16:16:13.368340] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:44.618 [2024-07-20 16:16:13.384693] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:07:44.618 INFO: Running with entropic power schedule (0xFF, 100). 00:07:44.618 INFO: Seed: 1708855066 00:07:44.875 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:07:44.875 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:07:44.875 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:44.875 INFO: A corpus is not provided, starting from an empty corpus 00:07:44.875 #2 INITED exec/s: 0 rss: 59Mb 00:07:44.875 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
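The traced nvmf/run.sh commands above repeat one pattern per fuzzer: derive a TCP port from the fuzzer number, rewrite the JSON target config to listen on that port, and launch the instrumented target for one timed run. A condensed sketch of that loop body, reconstructed from the -x trace rather than copied from the script ($rootdir stands for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk, and the redirection of sed's output into the per-run config is inferred, since the trace does not show it):

    fuzzer_type=4 timen=1 core=0x1
    port="44$(printf %02d "$fuzzer_type")"        # 04 -> 4404
    corpus_dir="$rootdir/../corpus/llvm_nvmf_$fuzzer_type"
    nvmf_cfg="/tmp/fuzz_json_$fuzzer_type.conf"
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
    mkdir -p "$corpus_dir"
    # point the JSON config at this run's TCP port
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
    # run the libFuzzer-instrumented target for $timen minute(s) on core mask $core
    "$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m "$core" -s 512 \
        -P "$rootdir/../output/llvm/" -F "$trid" -c "$nvmf_cfg" -t "$timen" \
        -D "$corpus_dir" -Z "$fuzzer_type" -r "/var/tmp/spdk$fuzzer_type.sock"

Each run then removes its config (rm -rf /tmp/fuzz_json_N.conf) and the loop counter advances, which is the "(( i++ ))" / "(( i < fuzz_num ))" pair visible in the trace.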
00:07:44.875 This may also happen if the target rejected all inputs we tried so far 00:07:44.875 [2024-07-20 16:16:13.461665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.875 [2024-07-20 16:16:13.461703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.875 [2024-07-20 16:16:13.461837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.875 [2024-07-20 16:16:13.461855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.876 [2024-07-20 16:16:13.461985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.876 [2024-07-20 16:16:13.462008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.876 [2024-07-20 16:16:13.462144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.876 [2024-07-20 16:16:13.462162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.134 NEW_FUNC[1/671]: 0x497920 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:07:45.134 NEW_FUNC[2/671]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:45.134 #4 NEW cov: 11532 ft: 11533 corp: 2/32b lim: 35 exec/s: 0 rss: 66Mb L: 31/31 MS: 2 InsertByte-InsertRepeatedBytes- 00:07:45.134 [2024-07-20 16:16:13.792494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.134 [2024-07-20 16:16:13.792544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.134 [2024-07-20 16:16:13.792673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:1a1a001a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.134 [2024-07-20 16:16:13.792694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.134 [2024-07-20 16:16:13.792832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.134 [2024-07-20 16:16:13.792855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.134 [2024-07-20 16:16:13.792990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.134 [2024-07-20 16:16:13.793013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.134 #10 NEW cov: 11645 ft: 
12159 corp: 3/66b lim: 35 exec/s: 0 rss: 66Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:07:45.134 [2024-07-20 16:16:13.842400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.134 [2024-07-20 16:16:13.842433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.134 [2024-07-20 16:16:13.842557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.134 [2024-07-20 16:16:13.842575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.134 [2024-07-20 16:16:13.842686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.134 [2024-07-20 16:16:13.842701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.134 [2024-07-20 16:16:13.842818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.134 [2024-07-20 16:16:13.842835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.135 #11 NEW cov: 11651 ft: 12483 corp: 4/99b lim: 35 exec/s: 0 rss: 66Mb L: 33/34 MS: 1 CopyPart- 00:07:45.135 [2024-07-20 16:16:13.882274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.135 [2024-07-20 16:16:13.882303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.135 [2024-07-20 16:16:13.882432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:1a1a001a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.135 [2024-07-20 16:16:13.882453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.135 [2024-07-20 16:16:13.882575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.135 [2024-07-20 16:16:13.882591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.135 #12 NEW cov: 11736 ft: 13004 corp: 5/125b lim: 35 exec/s: 0 rss: 66Mb L: 26/34 MS: 1 EraseBytes- 00:07:45.135 [2024-07-20 16:16:13.932437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.135 [2024-07-20 16:16:13.932472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.135 [2024-07-20 16:16:13.932603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.135 [2024-07-20 16:16:13.932620] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.135 [2024-07-20 16:16:13.932731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.135 [2024-07-20 16:16:13.932749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.394 #23 NEW cov: 11736 ft: 13164 corp: 6/151b lim: 35 exec/s: 0 rss: 66Mb L: 26/34 MS: 1 EraseBytes- 00:07:45.394 [2024-07-20 16:16:13.982519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:67673f67 cdw11:67670002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.394 [2024-07-20 16:16:13.982550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.394 [2024-07-20 16:16:13.982674] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:67676767 cdw11:67670002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.394 [2024-07-20 16:16:13.982693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.394 [2024-07-20 16:16:13.982809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:67676767 cdw11:67670002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.394 [2024-07-20 16:16:13.982827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.394 #27 NEW cov: 11736 ft: 13224 corp: 7/174b lim: 35 exec/s: 0 rss: 66Mb L: 23/34 MS: 4 ShuffleBytes-ChangeByte-CopyPart-InsertRepeatedBytes- 00:07:45.394 [2024-07-20 16:16:14.022533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.394 [2024-07-20 16:16:14.022562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.394 [2024-07-20 16:16:14.022689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:1a1a001a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.394 [2024-07-20 16:16:14.022705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.394 [2024-07-20 16:16:14.022830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:fff7ffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.394 [2024-07-20 16:16:14.022850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.394 [2024-07-20 16:16:14.022972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.394 [2024-07-20 16:16:14.022989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.394 #28 NEW cov: 11736 ft: 13345 corp: 8/208b lim: 35 exec/s: 0 rss: 66Mb L: 34/34 MS: 1 ChangeBinInt- 00:07:45.394 [2024-07-20 16:16:14.062631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 
cdw10:00002800 cdw11:000a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.394 [2024-07-20 16:16:14.062662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.394 [2024-07-20 16:16:14.062791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.394 [2024-07-20 16:16:14.062809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.394 [2024-07-20 16:16:14.062923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.394 [2024-07-20 16:16:14.062950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.394 [2024-07-20 16:16:14.063074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.394 [2024-07-20 16:16:14.063093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.394 #29 NEW cov: 11736 ft: 13384 corp: 9/240b lim: 35 exec/s: 0 rss: 67Mb L: 32/34 MS: 1 CrossOver- 00:07:45.394 [2024-07-20 16:16:14.102447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:1e1e251e cdw11:1e1e0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.394 [2024-07-20 16:16:14.102477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.394 [2024-07-20 16:16:14.102607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:1e1e1e1e cdw11:1e1e0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.394 [2024-07-20 16:16:14.102624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.394 [2024-07-20 16:16:14.102744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:1e1e1e1e cdw11:1e1e0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.394 [2024-07-20 16:16:14.102762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.394 #37 NEW cov: 11736 ft: 13465 corp: 10/267b lim: 35 exec/s: 0 rss: 67Mb L: 27/34 MS: 3 ShuffleBytes-ChangeByte-InsertRepeatedBytes- 00:07:45.394 [2024-07-20 16:16:14.143394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:67673f67 cdw11:67670002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.394 [2024-07-20 16:16:14.143422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.394 [2024-07-20 16:16:14.143546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:67676767 cdw11:67670002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.394 [2024-07-20 16:16:14.143564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.394 [2024-07-20 16:16:14.143693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE 
IO CQ (05) qid:0 cid:6 nsid:0 cdw10:67676767 cdw11:67670000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.394 [2024-07-20 16:16:14.143713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.394 [2024-07-20 16:16:14.143838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.394 [2024-07-20 16:16:14.143855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.394 #38 NEW cov: 11736 ft: 13528 corp: 11/298b lim: 35 exec/s: 0 rss: 67Mb L: 31/34 MS: 1 CrossOver- 00:07:45.394 [2024-07-20 16:16:14.183258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.394 [2024-07-20 16:16:14.183288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.394 [2024-07-20 16:16:14.183411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:1a1a001a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.394 [2024-07-20 16:16:14.183428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.394 [2024-07-20 16:16:14.183551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.394 [2024-07-20 16:16:14.183567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.659 #39 NEW cov: 11736 ft: 13567 corp: 12/324b lim: 35 exec/s: 0 rss: 67Mb L: 26/34 MS: 1 ChangeBit- 00:07:45.659 [2024-07-20 16:16:14.223629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002800 cdw11:002e0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.659 [2024-07-20 16:16:14.223658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.659 [2024-07-20 16:16:14.223777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.659 [2024-07-20 16:16:14.223796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.659 [2024-07-20 16:16:14.223915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.659 [2024-07-20 16:16:14.223932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.659 [2024-07-20 16:16:14.224052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.659 [2024-07-20 16:16:14.224069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.659 #40 NEW cov: 11736 ft: 13656 corp: 13/356b lim: 35 exec/s: 0 rss: 67Mb L: 32/34 MS: 1 ChangeByte- 
00:07:45.659 [2024-07-20 16:16:14.263766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:67673f67 cdw11:67670002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.659 [2024-07-20 16:16:14.263793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.659 [2024-07-20 16:16:14.263917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:67676767 cdw11:67670002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.659 [2024-07-20 16:16:14.263937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.659 [2024-07-20 16:16:14.264062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:67676767 cdw11:67670000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.659 [2024-07-20 16:16:14.264081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.659 [2024-07-20 16:16:14.264207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:97000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.659 [2024-07-20 16:16:14.264224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.659 #41 NEW cov: 11736 ft: 13679 corp: 14/388b lim: 35 exec/s: 0 rss: 67Mb L: 32/34 MS: 1 InsertByte- 00:07:45.659 [2024-07-20 16:16:14.303854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:67673f67 cdw11:67670002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.659 [2024-07-20 16:16:14.303881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.659 [2024-07-20 16:16:14.303997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:89676767 cdw11:67670002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.659 [2024-07-20 16:16:14.304014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.659 [2024-07-20 16:16:14.304142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:67676767 cdw11:67670000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.659 [2024-07-20 16:16:14.304159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.659 [2024-07-20 16:16:14.304288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.659 [2024-07-20 16:16:14.304304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.659 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:45.659 #42 NEW cov: 11759 ft: 13774 corp: 15/419b lim: 35 exec/s: 0 rss: 67Mb L: 31/34 MS: 1 ChangeByte- 00:07:45.659 [2024-07-20 16:16:14.343903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.659 [2024-07-20 
16:16:14.343931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.659 [2024-07-20 16:16:14.344065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.659 [2024-07-20 16:16:14.344085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.659 [2024-07-20 16:16:14.344204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.659 [2024-07-20 16:16:14.344220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.659 [2024-07-20 16:16:14.344336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.659 [2024-07-20 16:16:14.344352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.659 #43 NEW cov: 11759 ft: 13805 corp: 16/452b lim: 35 exec/s: 0 rss: 67Mb L: 33/34 MS: 1 ShuffleBytes- 00:07:45.659 [2024-07-20 16:16:14.384076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:3d002800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.659 [2024-07-20 16:16:14.384105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.659 [2024-07-20 16:16:14.384235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.659 [2024-07-20 16:16:14.384256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.659 [2024-07-20 16:16:14.384382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.659 [2024-07-20 16:16:14.384399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.659 [2024-07-20 16:16:14.384516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.659 [2024-07-20 16:16:14.384532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.659 #44 NEW cov: 11759 ft: 13818 corp: 17/485b lim: 35 exec/s: 0 rss: 67Mb L: 33/34 MS: 1 InsertByte- 00:07:45.659 [2024-07-20 16:16:14.434185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002800 cdw11:000a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.659 [2024-07-20 16:16:14.434213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.659 [2024-07-20 16:16:14.434329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00ac0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.659 [2024-07-20 
16:16:14.434347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.659 [2024-07-20 16:16:14.434461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.659 [2024-07-20 16:16:14.434489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.659 [2024-07-20 16:16:14.434605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.659 [2024-07-20 16:16:14.434621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.659 #45 NEW cov: 11759 ft: 13828 corp: 18/518b lim: 35 exec/s: 45 rss: 67Mb L: 33/34 MS: 1 InsertByte- 00:07:45.918 [2024-07-20 16:16:14.474062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.918 [2024-07-20 16:16:14.474090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.918 [2024-07-20 16:16:14.474218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.918 [2024-07-20 16:16:14.474238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.918 [2024-07-20 16:16:14.474359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.918 [2024-07-20 16:16:14.474376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.918 #46 NEW cov: 11759 ft: 13871 corp: 19/544b lim: 35 exec/s: 46 rss: 67Mb L: 26/34 MS: 1 EraseBytes- 00:07:45.918 [2024-07-20 16:16:14.514413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.918 [2024-07-20 16:16:14.514444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.918 [2024-07-20 16:16:14.514564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.918 [2024-07-20 16:16:14.514586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.918 [2024-07-20 16:16:14.514710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:fbfb00fb cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.918 [2024-07-20 16:16:14.514726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.918 [2024-07-20 16:16:14.514855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.918 [2024-07-20 
16:16:14.514872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.918 #47 NEW cov: 11759 ft: 13878 corp: 20/573b lim: 35 exec/s: 47 rss: 67Mb L: 29/34 MS: 1 InsertRepeatedBytes- 00:07:45.918 [2024-07-20 16:16:14.554361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002872 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.918 [2024-07-20 16:16:14.554388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.918 [2024-07-20 16:16:14.554508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.918 [2024-07-20 16:16:14.554525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.919 [2024-07-20 16:16:14.554641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.919 [2024-07-20 16:16:14.554658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.919 #48 NEW cov: 11759 ft: 13897 corp: 21/600b lim: 35 exec/s: 48 rss: 68Mb L: 27/34 MS: 1 InsertByte- 00:07:45.919 [2024-07-20 16:16:14.594447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.919 [2024-07-20 16:16:14.594476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.919 [2024-07-20 16:16:14.594602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.919 [2024-07-20 16:16:14.594619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.919 [2024-07-20 16:16:14.594739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:d7000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.919 [2024-07-20 16:16:14.594764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.919 #49 NEW cov: 11759 ft: 13922 corp: 22/627b lim: 35 exec/s: 49 rss: 68Mb L: 27/34 MS: 1 InsertByte- 00:07:45.919 [2024-07-20 16:16:14.634860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:67673f67 cdw11:67670002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.919 [2024-07-20 16:16:14.634891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.919 [2024-07-20 16:16:14.635015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:678967da cdw11:67670002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.919 [2024-07-20 16:16:14.635034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.919 [2024-07-20 16:16:14.635149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE 
IO CQ (05) qid:0 cid:6 nsid:0 cdw10:67676767 cdw11:67670002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.919 [2024-07-20 16:16:14.635170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.919 [2024-07-20 16:16:14.635290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.919 [2024-07-20 16:16:14.635307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.919 #50 NEW cov: 11759 ft: 13937 corp: 23/659b lim: 35 exec/s: 50 rss: 68Mb L: 32/34 MS: 1 InsertByte- 00:07:45.919 [2024-07-20 16:16:14.685025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002800 cdw11:000a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.919 [2024-07-20 16:16:14.685055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.919 [2024-07-20 16:16:14.685184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:002c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.919 [2024-07-20 16:16:14.685202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.919 [2024-07-20 16:16:14.685308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.919 [2024-07-20 16:16:14.685325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.919 [2024-07-20 16:16:14.685440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.919 [2024-07-20 16:16:14.685461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.919 #51 NEW cov: 11759 ft: 13985 corp: 24/692b lim: 35 exec/s: 51 rss: 68Mb L: 33/34 MS: 1 ChangeBit- 00:07:46.177 [2024-07-20 16:16:14.735029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002872 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.177 [2024-07-20 16:16:14.735059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.177 [2024-07-20 16:16:14.735178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.177 [2024-07-20 16:16:14.735196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.177 [2024-07-20 16:16:14.735320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.177 [2024-07-20 16:16:14.735338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.177 [2024-07-20 16:16:14.735454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE 
IO CQ (05) qid:0 cid:7 nsid:0 cdw10:eeee00ee cdw11:ee000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.177 [2024-07-20 16:16:14.735472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.177 #52 NEW cov: 11759 ft: 13998 corp: 25/723b lim: 35 exec/s: 52 rss: 68Mb L: 31/34 MS: 1 InsertRepeatedBytes- 00:07:46.178 [2024-07-20 16:16:14.784314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:28000028 cdw11:00280000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.178 [2024-07-20 16:16:14.784342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.178 #54 NEW cov: 11759 ft: 14744 corp: 26/730b lim: 35 exec/s: 54 rss: 68Mb L: 7/34 MS: 2 CrossOver-CopyPart- 00:07:46.178 [2024-07-20 16:16:14.825326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:003f2800 cdw11:67670002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.178 [2024-07-20 16:16:14.825354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.178 [2024-07-20 16:16:14.825484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:67676767 cdw11:67670002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.178 [2024-07-20 16:16:14.825502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.178 [2024-07-20 16:16:14.825625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:67006767 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.178 [2024-07-20 16:16:14.825644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.178 [2024-07-20 16:16:14.825771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:1a1a001a cdw11:00670000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.178 [2024-07-20 16:16:14.825788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.178 #55 NEW cov: 11759 ft: 14754 corp: 27/762b lim: 35 exec/s: 55 rss: 68Mb L: 32/34 MS: 1 CrossOver- 00:07:46.178 [2024-07-20 16:16:14.865468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:67673f67 cdw11:67670002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.178 [2024-07-20 16:16:14.865496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.178 [2024-07-20 16:16:14.865622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:67676767 cdw11:67670002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.178 [2024-07-20 16:16:14.865640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.178 [2024-07-20 16:16:14.865755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:67676767 cdw11:67670000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.178 [2024-07-20 16:16:14.865782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 
sqhd:0011 p:0 m:0 dnr:0 00:07:46.178 [2024-07-20 16:16:14.865906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.178 [2024-07-20 16:16:14.865922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.178 #56 NEW cov: 11759 ft: 14766 corp: 28/793b lim: 35 exec/s: 56 rss: 68Mb L: 31/34 MS: 1 ChangeBit- 00:07:46.178 [2024-07-20 16:16:14.905083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:9d9d009d cdw11:9d9d0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.178 [2024-07-20 16:16:14.905114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.178 [2024-07-20 16:16:14.905244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:9d9d9d9d cdw11:28280000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.178 [2024-07-20 16:16:14.905263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.178 #57 NEW cov: 11759 ft: 14973 corp: 29/810b lim: 35 exec/s: 57 rss: 68Mb L: 17/34 MS: 1 InsertRepeatedBytes- 00:07:46.178 [2024-07-20 16:16:14.955801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002800 cdw11:00ab0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.178 [2024-07-20 16:16:14.955831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.178 [2024-07-20 16:16:14.955961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:5ff20060 cdw11:2e000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.178 [2024-07-20 16:16:14.955981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.178 [2024-07-20 16:16:14.956109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.178 [2024-07-20 16:16:14.956127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.178 [2024-07-20 16:16:14.956251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.178 [2024-07-20 16:16:14.956268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.178 #58 NEW cov: 11759 ft: 14978 corp: 30/841b lim: 35 exec/s: 58 rss: 68Mb L: 31/34 MS: 1 CMP- DE: "\253\236\000`_\362.\000"- 00:07:46.436 [2024-07-20 16:16:14.995789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.436 [2024-07-20 16:16:14.995820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.436 [2024-07-20 16:16:14.995947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:46.436 [2024-07-20 16:16:14.995964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.436 [2024-07-20 16:16:14.996085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.436 [2024-07-20 16:16:14.996104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.436 [2024-07-20 16:16:14.996226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.436 [2024-07-20 16:16:14.996245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.436 #59 NEW cov: 11759 ft: 14991 corp: 31/874b lim: 35 exec/s: 59 rss: 68Mb L: 33/34 MS: 1 CopyPart- 00:07:46.436 [2024-07-20 16:16:15.035965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.436 [2024-07-20 16:16:15.035996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.436 [2024-07-20 16:16:15.036120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.436 [2024-07-20 16:16:15.036139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.436 [2024-07-20 16:16:15.036250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.436 [2024-07-20 16:16:15.036267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.436 [2024-07-20 16:16:15.036387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ab9e0000 cdw11:00600002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.436 [2024-07-20 16:16:15.036404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.436 #60 NEW cov: 11759 ft: 15005 corp: 32/907b lim: 35 exec/s: 60 rss: 68Mb L: 33/34 MS: 1 PersAutoDict- DE: "\253\236\000`_\362.\000"- 00:07:46.436 [2024-07-20 16:16:15.086019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002800 cdw11:000a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.437 [2024-07-20 16:16:15.086049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.437 [2024-07-20 16:16:15.086176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.437 [2024-07-20 16:16:15.086195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.437 [2024-07-20 16:16:15.086317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:0200ffff 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.437 [2024-07-20 16:16:15.086336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.437 [2024-07-20 16:16:15.086456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.437 [2024-07-20 16:16:15.086474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.437 #61 NEW cov: 11759 ft: 15017 corp: 33/939b lim: 35 exec/s: 61 rss: 68Mb L: 32/34 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\002"- 00:07:46.437 [2024-07-20 16:16:15.126460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:67673f67 cdw11:67670002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.437 [2024-07-20 16:16:15.126490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.437 [2024-07-20 16:16:15.126624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:67676767 cdw11:67670002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.437 [2024-07-20 16:16:15.126643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.437 [2024-07-20 16:16:15.126764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:67676767 cdw11:67670000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.437 [2024-07-20 16:16:15.126781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.437 [2024-07-20 16:16:15.126900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffff0000 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.437 [2024-07-20 16:16:15.126919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.437 #62 NEW cov: 11759 ft: 15078 corp: 34/970b lim: 35 exec/s: 62 rss: 68Mb L: 31/34 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\002"- 00:07:46.437 [2024-07-20 16:16:15.176399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002800 cdw11:000a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.437 [2024-07-20 16:16:15.176430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.437 [2024-07-20 16:16:15.176555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:002c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.437 [2024-07-20 16:16:15.176572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.437 [2024-07-20 16:16:15.176693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.437 [2024-07-20 16:16:15.176712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.437 [2024-07-20 16:16:15.176836] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.437 [2024-07-20 16:16:15.176854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.437 #63 NEW cov: 11759 ft: 15160 corp: 35/1003b lim: 35 exec/s: 63 rss: 68Mb L: 33/34 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\002"- 00:07:46.437 [2024-07-20 16:16:15.226022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:635d009d cdw11:9d9d0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.437 [2024-07-20 16:16:15.226053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.437 [2024-07-20 16:16:15.226175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:9d9d9d9d cdw11:28280000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.437 [2024-07-20 16:16:15.226193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.697 #64 NEW cov: 11759 ft: 15181 corp: 36/1020b lim: 35 exec/s: 64 rss: 69Mb L: 17/34 MS: 1 ChangeBinInt- 00:07:46.697 [2024-07-20 16:16:15.276660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002800 cdw11:000a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.697 [2024-07-20 16:16:15.276690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.697 [2024-07-20 16:16:15.276822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.697 [2024-07-20 16:16:15.276840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.697 [2024-07-20 16:16:15.276959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.697 [2024-07-20 16:16:15.276976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.697 [2024-07-20 16:16:15.277096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ab9e0000 cdw11:00600002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.697 [2024-07-20 16:16:15.277114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.697 #65 NEW cov: 11759 ft: 15183 corp: 37/1052b lim: 35 exec/s: 65 rss: 69Mb L: 32/34 MS: 1 PersAutoDict- DE: "\253\236\000`_\362.\000"- 00:07:46.697 [2024-07-20 16:16:15.316623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002800 cdw11:000a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.697 [2024-07-20 16:16:15.316653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.697 [2024-07-20 16:16:15.316784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:002c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.697 [2024-07-20 
16:16:15.316802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.697 [2024-07-20 16:16:15.316935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.697 [2024-07-20 16:16:15.316952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.697 [2024-07-20 16:16:15.317075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.697 [2024-07-20 16:16:15.317094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.697 #66 NEW cov: 11759 ft: 15208 corp: 38/1085b lim: 35 exec/s: 66 rss: 69Mb L: 33/34 MS: 1 ShuffleBytes- 00:07:46.697 [2024-07-20 16:16:15.366556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002800 cdw11:000a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.697 [2024-07-20 16:16:15.366585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.697 [2024-07-20 16:16:15.366711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.697 [2024-07-20 16:16:15.366739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.697 [2024-07-20 16:16:15.366859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:0200ffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.697 [2024-07-20 16:16:15.366873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.697 [2024-07-20 16:16:15.367002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:000000f2 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.697 [2024-07-20 16:16:15.367018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.697 #67 NEW cov: 11759 ft: 15213 corp: 39/1118b lim: 35 exec/s: 67 rss: 69Mb L: 33/34 MS: 1 InsertByte- 00:07:46.697 [2024-07-20 16:16:15.406375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:67673f67 cdw11:67670002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.697 [2024-07-20 16:16:15.406406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.697 [2024-07-20 16:16:15.406514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffff6767 cdw11:ffff0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.697 [2024-07-20 16:16:15.406531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.697 [2024-07-20 16:16:15.406653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:67676767 cdw11:67670002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.697 
[2024-07-20 16:16:15.406671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.697 #68 NEW cov: 11759 ft: 15228 corp: 40/1145b lim: 35 exec/s: 68 rss: 69Mb L: 27/34 MS: 1 InsertRepeatedBytes- 00:07:46.697 [2024-07-20 16:16:15.446711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:1e1e251e cdw11:1e1e0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.697 [2024-07-20 16:16:15.446741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.697 [2024-07-20 16:16:15.446871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:1e1e1e1e cdw11:1e1e0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.697 [2024-07-20 16:16:15.446888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.697 [2024-07-20 16:16:15.447009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:1e1e1e1e cdw11:1e1e0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.697 [2024-07-20 16:16:15.447026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.697 #69 NEW cov: 11759 ft: 15252 corp: 41/1172b lim: 35 exec/s: 34 rss: 69Mb L: 27/34 MS: 1 ShuffleBytes- 00:07:46.697 #69 DONE cov: 11759 ft: 15252 corp: 41/1172b lim: 35 exec/s: 34 rss: 69Mb 00:07:46.697 ###### Recommended dictionary. ###### 00:07:46.697 "\253\236\000`_\362.\000" # Uses: 2 00:07:46.697 "\377\377\377\377\377\377\377\002" # Uses: 2 00:07:46.697 ###### End of recommended dictionary. 
###### 00:07:46.697 Done 69 runs in 2 second(s) 00:07:46.957 16:16:15 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_4.conf 00:07:46.957 16:16:15 -- ../common.sh@72 -- # (( i++ )) 00:07:46.957 16:16:15 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:46.957 16:16:15 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:07:46.957 16:16:15 -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:07:46.957 16:16:15 -- nvmf/run.sh@24 -- # local timen=1 00:07:46.957 16:16:15 -- nvmf/run.sh@25 -- # local core=0x1 00:07:46.957 16:16:15 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:46.957 16:16:15 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:07:46.957 16:16:15 -- nvmf/run.sh@29 -- # printf %02d 5 00:07:46.957 16:16:15 -- nvmf/run.sh@29 -- # port=4405 00:07:46.957 16:16:15 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:46.957 16:16:15 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:07:46.957 16:16:15 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:46.957 16:16:15 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 -r /var/tmp/spdk5.sock 00:07:46.957 [2024-07-20 16:16:15.629456] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:07:46.957 [2024-07-20 16:16:15.629532] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2271657 ] 00:07:46.957 EAL: No free 2048 kB hugepages reported on node 1 00:07:47.217 [2024-07-20 16:16:15.808299] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:47.217 [2024-07-20 16:16:15.827936] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:47.217 [2024-07-20 16:16:15.828058] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:47.217 [2024-07-20 16:16:15.879548] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:47.217 [2024-07-20 16:16:15.895833] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:07:47.217 INFO: Running with entropic power schedule (0xFF, 100). 00:07:47.217 INFO: Seed: 4218850456 00:07:47.217 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:07:47.217 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:07:47.217 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:47.217 INFO: A corpus is not provided, starting from an empty corpus 00:07:47.217 #2 INITED exec/s: 0 rss: 59Mb 00:07:47.217 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:47.217 This may also happen if the target rejected all inputs we tried so far 00:07:47.217 [2024-07-20 16:16:15.941264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:adad31ad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.217 [2024-07-20 16:16:15.941293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.217 [2024-07-20 16:16:15.941344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:adadadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.217 [2024-07-20 16:16:15.941358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.217 [2024-07-20 16:16:15.941407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:adadadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.217 [2024-07-20 16:16:15.941424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.476 NEW_FUNC[1/671]: 0x499ab0 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:07:47.476 NEW_FUNC[2/671]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:47.476 #7 NEW cov: 11543 ft: 11544 corp: 2/28b lim: 45 exec/s: 0 rss: 66Mb L: 27/27 MS: 5 ShuffleBytes-ChangeByte-ShuffleBytes-ChangeBinInt-InsertRepeatedBytes- 00:07:47.476 [2024-07-20 16:16:16.273423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:adad31ad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.476 [2024-07-20 16:16:16.273491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.476 [2024-07-20 16:16:16.273643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:adadadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.476 [2024-07-20 16:16:16.273671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.476 [2024-07-20 16:16:16.273807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:adadadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.476 [2024-07-20 16:16:16.273837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.736 #13 NEW cov: 11656 ft: 12328 corp: 3/55b lim: 45 exec/s: 0 rss: 66Mb L: 27/27 MS: 1 ChangeByte- 00:07:47.736 [2024-07-20 16:16:16.323495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:56561aad cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.736 [2024-07-20 16:16:16.323525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.736 [2024-07-20 16:16:16.323635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:56565656 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.736 [2024-07-20 16:16:16.323653] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.736 [2024-07-20 16:16:16.323775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:56565656 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.736 [2024-07-20 16:16:16.323791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.736 [2024-07-20 16:16:16.323905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:56565656 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.736 [2024-07-20 16:16:16.323923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.736 #18 NEW cov: 11662 ft: 12833 corp: 4/94b lim: 45 exec/s: 0 rss: 66Mb L: 39/39 MS: 5 ChangeBit-InsertByte-ChangeBit-CrossOver-InsertRepeatedBytes- 00:07:47.736 [2024-07-20 16:16:16.363684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:56561aad cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.736 [2024-07-20 16:16:16.363714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.736 [2024-07-20 16:16:16.363830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:56565656 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.736 [2024-07-20 16:16:16.363847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.736 [2024-07-20 16:16:16.363966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:adad56ad cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.736 [2024-07-20 16:16:16.363986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.736 [2024-07-20 16:16:16.364105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:56565656 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.736 [2024-07-20 16:16:16.364125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.736 #19 NEW cov: 11747 ft: 13050 corp: 5/133b lim: 45 exec/s: 0 rss: 66Mb L: 39/39 MS: 1 CrossOver- 00:07:47.736 [2024-07-20 16:16:16.413812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:56561aad cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.736 [2024-07-20 16:16:16.413842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.736 [2024-07-20 16:16:16.413952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:56565656 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.736 [2024-07-20 16:16:16.413970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.736 [2024-07-20 16:16:16.414080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:7ead56ad cdw11:ad560002 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:47.736 [2024-07-20 16:16:16.414096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.736 [2024-07-20 16:16:16.414215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:56565656 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.736 [2024-07-20 16:16:16.414233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.736 #20 NEW cov: 11747 ft: 13177 corp: 6/173b lim: 45 exec/s: 0 rss: 66Mb L: 40/40 MS: 1 InsertByte- 00:07:47.736 [2024-07-20 16:16:16.463710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:adad31ad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.736 [2024-07-20 16:16:16.463739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.736 [2024-07-20 16:16:16.463859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:adadadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.736 [2024-07-20 16:16:16.463876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.736 [2024-07-20 16:16:16.463992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:adadadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.736 [2024-07-20 16:16:16.464010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.736 #21 NEW cov: 11747 ft: 13204 corp: 7/200b lim: 45 exec/s: 0 rss: 66Mb L: 27/40 MS: 1 ShuffleBytes- 00:07:47.736 [2024-07-20 16:16:16.514163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:56561aad cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.736 [2024-07-20 16:16:16.514194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.736 [2024-07-20 16:16:16.514320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:56565656 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.736 [2024-07-20 16:16:16.514341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.736 [2024-07-20 16:16:16.514449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:adad56ad cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.736 [2024-07-20 16:16:16.514470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.736 [2024-07-20 16:16:16.514592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:56565657 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.736 [2024-07-20 16:16:16.514609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.736 #22 NEW cov: 11747 ft: 13270 corp: 8/239b lim: 45 exec/s: 0 rss: 67Mb L: 39/40 MS: 1 ChangeBit- 00:07:47.996 [2024-07-20 16:16:16.553700] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ad563e1a cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.996 [2024-07-20 16:16:16.553731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.996 [2024-07-20 16:16:16.553855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:56565656 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.996 [2024-07-20 16:16:16.553872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.997 #25 NEW cov: 11747 ft: 13559 corp: 9/257b lim: 45 exec/s: 0 rss: 67Mb L: 18/40 MS: 3 ChangeByte-ChangeBit-CrossOver- 00:07:47.997 [2024-07-20 16:16:16.594327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:56561aad cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.997 [2024-07-20 16:16:16.594354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.997 [2024-07-20 16:16:16.594472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:56565656 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.997 [2024-07-20 16:16:16.594490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.997 [2024-07-20 16:16:16.594594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:56ad5656 cdw11:7ead0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.997 [2024-07-20 16:16:16.594611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.997 [2024-07-20 16:16:16.594726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:56565656 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.997 [2024-07-20 16:16:16.594743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.997 #26 NEW cov: 11747 ft: 13605 corp: 10/299b lim: 45 exec/s: 0 rss: 67Mb L: 42/42 MS: 1 CopyPart- 00:07:47.997 [2024-07-20 16:16:16.644244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:adad31ad cdw11:adac0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.997 [2024-07-20 16:16:16.644274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.997 [2024-07-20 16:16:16.644403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:adadadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.997 [2024-07-20 16:16:16.644420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.997 [2024-07-20 16:16:16.644533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:adadadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.997 [2024-07-20 16:16:16.644549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.997 #27 NEW cov: 11747 ft: 13671 
corp: 11/326b lim: 45 exec/s: 0 rss: 67Mb L: 27/42 MS: 1 ChangeBit- 00:07:47.997 [2024-07-20 16:16:16.684558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:56561aad cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.997 [2024-07-20 16:16:16.684585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.997 [2024-07-20 16:16:16.684704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:56565656 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.997 [2024-07-20 16:16:16.684737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.997 [2024-07-20 16:16:16.684847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:7ead56ad cdw11:ad560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.997 [2024-07-20 16:16:16.684865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.997 [2024-07-20 16:16:16.684984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:56565656 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.997 [2024-07-20 16:16:16.685004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.997 #28 NEW cov: 11747 ft: 13746 corp: 12/366b lim: 45 exec/s: 0 rss: 67Mb L: 40/42 MS: 1 ChangeBinInt- 00:07:47.997 [2024-07-20 16:16:16.724464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:adad31ad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.997 [2024-07-20 16:16:16.724492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.997 [2024-07-20 16:16:16.724608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:adadadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.997 [2024-07-20 16:16:16.724625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.997 [2024-07-20 16:16:16.724743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:adadadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.997 [2024-07-20 16:16:16.724759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.997 #29 NEW cov: 11747 ft: 13790 corp: 13/393b lim: 45 exec/s: 0 rss: 67Mb L: 27/42 MS: 1 ShuffleBytes- 00:07:47.997 [2024-07-20 16:16:16.764465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:adad31ad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.997 [2024-07-20 16:16:16.764493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.997 [2024-07-20 16:16:16.764611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:adadadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.997 [2024-07-20 16:16:16.764629] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.997 [2024-07-20 16:16:16.764751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:56565656 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.997 [2024-07-20 16:16:16.764769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.997 #30 NEW cov: 11747 ft: 13890 corp: 14/428b lim: 45 exec/s: 0 rss: 67Mb L: 35/42 MS: 1 CrossOver- 00:07:48.257 [2024-07-20 16:16:16.804635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:adad31ad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.257 [2024-07-20 16:16:16.804663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.257 [2024-07-20 16:16:16.804781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:adacadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.257 [2024-07-20 16:16:16.804800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.257 [2024-07-20 16:16:16.804912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:adadadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.257 [2024-07-20 16:16:16.804928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.257 #31 NEW cov: 11747 ft: 13978 corp: 15/455b lim: 45 exec/s: 0 rss: 67Mb L: 27/42 MS: 1 ChangeBit- 00:07:48.257 [2024-07-20 16:16:16.844975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:56561aad cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.257 [2024-07-20 16:16:16.845004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.257 [2024-07-20 16:16:16.845122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:56565656 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.257 [2024-07-20 16:16:16.845138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.257 [2024-07-20 16:16:16.845247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:56ad5656 cdw11:7ead0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.257 [2024-07-20 16:16:16.845266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.257 [2024-07-20 16:16:16.845377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:56565656 cdw11:56a70002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.257 [2024-07-20 16:16:16.845395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.257 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:48.257 #32 NEW cov: 11770 ft: 14028 corp: 16/497b lim: 45 exec/s: 0 rss: 67Mb L: 42/42 MS: 1 ChangeBinInt- 00:07:48.257 [2024-07-20 16:16:16.895229] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:56561aad cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.257 [2024-07-20 16:16:16.895256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.257 [2024-07-20 16:16:16.895384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:56565656 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.257 [2024-07-20 16:16:16.895402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.257 [2024-07-20 16:16:16.895510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:7ead56ad cdw11:ad520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.257 [2024-07-20 16:16:16.895526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.257 [2024-07-20 16:16:16.895638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:56565656 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.257 [2024-07-20 16:16:16.895655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.257 [2024-07-20 16:16:16.935330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:56561aad cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.257 [2024-07-20 16:16:16.935358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.257 [2024-07-20 16:16:16.935492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:c4565656 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.257 [2024-07-20 16:16:16.935510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.257 [2024-07-20 16:16:16.935626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:7ead56ad cdw11:ad520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.257 [2024-07-20 16:16:16.935643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.257 [2024-07-20 16:16:16.935761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:56565656 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.257 [2024-07-20 16:16:16.935779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.257 #34 NEW cov: 11770 ft: 14039 corp: 17/537b lim: 45 exec/s: 34 rss: 67Mb L: 40/42 MS: 2 ChangeBit-ChangeByte- 00:07:48.257 [2024-07-20 16:16:16.975498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:56561aad cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.257 [2024-07-20 16:16:16.975526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.257 [2024-07-20 16:16:16.975642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:56565656 
cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.257 [2024-07-20 16:16:16.975659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.257 [2024-07-20 16:16:16.975772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:56ad5656 cdw11:7ead0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.257 [2024-07-20 16:16:16.975787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.257 [2024-07-20 16:16:16.975903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:56565656 cdw11:56a70002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.257 [2024-07-20 16:16:16.975921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.257 #35 NEW cov: 11770 ft: 14072 corp: 18/579b lim: 45 exec/s: 35 rss: 67Mb L: 42/42 MS: 1 ChangeBinInt- 00:07:48.257 [2024-07-20 16:16:17.015810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:adad31ad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.257 [2024-07-20 16:16:17.015836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.257 [2024-07-20 16:16:17.015946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:adadadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.257 [2024-07-20 16:16:17.015963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.257 [2024-07-20 16:16:17.016084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:adadadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.257 [2024-07-20 16:16:17.016101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.257 [2024-07-20 16:16:17.016219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:adadadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.257 [2024-07-20 16:16:17.016234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.257 [2024-07-20 16:16:17.016340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:adadadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.258 [2024-07-20 16:16:17.016360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:48.258 #36 NEW cov: 11770 ft: 14142 corp: 19/624b lim: 45 exec/s: 36 rss: 67Mb L: 45/45 MS: 1 CopyPart- 00:07:48.258 [2024-07-20 16:16:17.055739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:adad31ad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.258 [2024-07-20 16:16:17.055768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.258 [2024-07-20 16:16:17.055886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:adadeaad 
cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.258 [2024-07-20 16:16:17.055902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.258 [2024-07-20 16:16:17.056016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:5656ad56 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.258 [2024-07-20 16:16:17.056033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.258 [2024-07-20 16:16:17.056145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:adadadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.258 [2024-07-20 16:16:17.056161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.518 #37 NEW cov: 11770 ft: 14166 corp: 20/660b lim: 45 exec/s: 37 rss: 68Mb L: 36/45 MS: 1 InsertByte- 00:07:48.518 [2024-07-20 16:16:17.095454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:adad31ad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.518 [2024-07-20 16:16:17.095483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.518 [2024-07-20 16:16:17.095605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:adadadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.518 [2024-07-20 16:16:17.095624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.518 [2024-07-20 16:16:17.095732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:56565656 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.518 [2024-07-20 16:16:17.095747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.518 #38 NEW cov: 11770 ft: 14174 corp: 21/695b lim: 45 exec/s: 38 rss: 68Mb L: 35/45 MS: 1 ChangeByte- 00:07:48.518 [2024-07-20 16:16:17.135719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:adad31ad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.518 [2024-07-20 16:16:17.135747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.518 [2024-07-20 16:16:17.135862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:adacadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.518 [2024-07-20 16:16:17.135878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.518 [2024-07-20 16:16:17.135996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:adadadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.518 [2024-07-20 16:16:17.136012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.518 #39 NEW cov: 11770 ft: 14196 corp: 22/722b lim: 45 exec/s: 39 rss: 68Mb L: 27/45 MS: 1 ShuffleBytes- 00:07:48.518 [2024-07-20 
16:16:17.175738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:adad31ad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.518 [2024-07-20 16:16:17.175766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.518 [2024-07-20 16:16:17.175885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:adadadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.518 [2024-07-20 16:16:17.175903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.518 [2024-07-20 16:16:17.176016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:adadadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.518 [2024-07-20 16:16:17.176033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.518 #40 NEW cov: 11770 ft: 14217 corp: 23/749b lim: 45 exec/s: 40 rss: 68Mb L: 27/45 MS: 1 ChangeBit- 00:07:48.518 [2024-07-20 16:16:17.216138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:56561aad cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.518 [2024-07-20 16:16:17.216164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.518 [2024-07-20 16:16:17.216277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:56565656 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.518 [2024-07-20 16:16:17.216294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.518 [2024-07-20 16:16:17.216411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:56ad5656 cdw11:7ead0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.518 [2024-07-20 16:16:17.216428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.518 [2024-07-20 16:16:17.216550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:56565656 cdw11:56a70002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.518 [2024-07-20 16:16:17.216567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.518 #41 NEW cov: 11770 ft: 14232 corp: 24/791b lim: 45 exec/s: 41 rss: 68Mb L: 42/45 MS: 1 ChangeByte- 00:07:48.518 [2024-07-20 16:16:17.255783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:94563e1a cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.518 [2024-07-20 16:16:17.255809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.518 [2024-07-20 16:16:17.255919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:56565656 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.518 [2024-07-20 16:16:17.255935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.518 #42 NEW cov: 
11770 ft: 14238 corp: 25/809b lim: 45 exec/s: 42 rss: 68Mb L: 18/45 MS: 1 ChangeByte- 00:07:48.518 [2024-07-20 16:16:17.296154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:adad31ad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.518 [2024-07-20 16:16:17.296183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.518 [2024-07-20 16:16:17.296296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:adadadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.518 [2024-07-20 16:16:17.296313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.518 [2024-07-20 16:16:17.296422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:adadadad cdw11:adad0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.518 [2024-07-20 16:16:17.296444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.518 #43 NEW cov: 11770 ft: 14241 corp: 26/836b lim: 45 exec/s: 43 rss: 68Mb L: 27/45 MS: 1 ChangeByte- 00:07:48.778 [2024-07-20 16:16:17.336566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:56561aad cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.778 [2024-07-20 16:16:17.336593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.778 [2024-07-20 16:16:17.336718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:56565656 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.778 [2024-07-20 16:16:17.336736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.778 [2024-07-20 16:16:17.336852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:adad56ad cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.778 [2024-07-20 16:16:17.336867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.778 [2024-07-20 16:16:17.336985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:56565657 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.778 [2024-07-20 16:16:17.337003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.778 #44 NEW cov: 11770 ft: 14257 corp: 27/875b lim: 45 exec/s: 44 rss: 68Mb L: 39/45 MS: 1 ChangeByte- 00:07:48.778 [2024-07-20 16:16:17.376395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:adad31ad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.778 [2024-07-20 16:16:17.376423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.779 [2024-07-20 16:16:17.376536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:adadadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.779 [2024-07-20 16:16:17.376554] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.779 [2024-07-20 16:16:17.376672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:acadadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.779 [2024-07-20 16:16:17.376688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.779 #45 NEW cov: 11770 ft: 14294 corp: 28/910b lim: 45 exec/s: 45 rss: 68Mb L: 35/45 MS: 1 CrossOver- 00:07:48.779 [2024-07-20 16:16:17.416748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:56561aad cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.779 [2024-07-20 16:16:17.416776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.779 [2024-07-20 16:16:17.416904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:56565656 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.779 [2024-07-20 16:16:17.416922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.779 [2024-07-20 16:16:17.417037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:7ead56ad cdw11:ad560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.779 [2024-07-20 16:16:17.417055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.779 [2024-07-20 16:16:17.417172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:56565656 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.779 [2024-07-20 16:16:17.417190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.779 #46 NEW cov: 11770 ft: 14298 corp: 29/950b lim: 45 exec/s: 46 rss: 68Mb L: 40/45 MS: 1 CrossOver- 00:07:48.779 [2024-07-20 16:16:17.456939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:56561aad cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.779 [2024-07-20 16:16:17.456968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.779 [2024-07-20 16:16:17.457091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:56565656 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.779 [2024-07-20 16:16:17.457108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.779 [2024-07-20 16:16:17.457225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:7ead56ad cdw11:ad560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.779 [2024-07-20 16:16:17.457242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.779 [2024-07-20 16:16:17.457348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:56565656 cdw11:56520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.779 [2024-07-20 16:16:17.457365] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.779 #47 NEW cov: 11770 ft: 14303 corp: 30/990b lim: 45 exec/s: 47 rss: 68Mb L: 40/45 MS: 1 ChangeBit- 00:07:48.779 [2024-07-20 16:16:17.507106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:56561aad cdw11:56160002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.779 [2024-07-20 16:16:17.507135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.779 [2024-07-20 16:16:17.507254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:56565656 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.779 [2024-07-20 16:16:17.507272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.779 [2024-07-20 16:16:17.507390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:adad56ad cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.779 [2024-07-20 16:16:17.507408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.779 [2024-07-20 16:16:17.507518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:56565657 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.779 [2024-07-20 16:16:17.507534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.779 #48 NEW cov: 11770 ft: 14307 corp: 31/1029b lim: 45 exec/s: 48 rss: 68Mb L: 39/45 MS: 1 ChangeBit- 00:07:48.779 [2024-07-20 16:16:17.546913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:adad31ad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.779 [2024-07-20 16:16:17.546943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.779 [2024-07-20 16:16:17.547063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:adacadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.779 [2024-07-20 16:16:17.547080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.779 [2024-07-20 16:16:17.547192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:adadadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.779 [2024-07-20 16:16:17.547211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.779 #49 NEW cov: 11770 ft: 14328 corp: 32/1057b lim: 45 exec/s: 49 rss: 68Mb L: 28/45 MS: 1 InsertByte- 00:07:49.039 [2024-07-20 16:16:17.587386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:adad31ad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.039 [2024-07-20 16:16:17.587430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.039 [2024-07-20 16:16:17.587557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 
cdw10:adadeaad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.039 [2024-07-20 16:16:17.587576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.039 [2024-07-20 16:16:17.587703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:5656ad56 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.039 [2024-07-20 16:16:17.587720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.039 [2024-07-20 16:16:17.587837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:adadadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.039 [2024-07-20 16:16:17.587852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.039 [2024-07-20 16:16:17.627507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:adad31ad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.039 [2024-07-20 16:16:17.627536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.039 [2024-07-20 16:16:17.627650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:adadeaad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.039 [2024-07-20 16:16:17.627668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.039 [2024-07-20 16:16:17.627785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:5656ad56 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.039 [2024-07-20 16:16:17.627801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.039 [2024-07-20 16:16:17.627915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:adadadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.039 [2024-07-20 16:16:17.627932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.039 #51 NEW cov: 11770 ft: 14338 corp: 33/1093b lim: 45 exec/s: 51 rss: 68Mb L: 36/45 MS: 2 ChangeBit-ChangeByte- 00:07:49.039 [2024-07-20 16:16:17.667633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:56561aad cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.040 [2024-07-20 16:16:17.667662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.040 [2024-07-20 16:16:17.667775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:56565656 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.040 [2024-07-20 16:16:17.667793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.040 [2024-07-20 16:16:17.667912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:56ad5656 cdw11:7ead0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.040 [2024-07-20 
16:16:17.667931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.040 [2024-07-20 16:16:17.668042] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:56565656 cdw11:56a70002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.040 [2024-07-20 16:16:17.668060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.040 #52 NEW cov: 11770 ft: 14339 corp: 34/1136b lim: 45 exec/s: 52 rss: 69Mb L: 43/45 MS: 1 InsertByte- 00:07:49.040 [2024-07-20 16:16:17.707375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:adad31ad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.040 [2024-07-20 16:16:17.707402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.040 [2024-07-20 16:16:17.707518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:adadadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.040 [2024-07-20 16:16:17.707533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.040 [2024-07-20 16:16:17.707649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:56565656 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.040 [2024-07-20 16:16:17.707665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.040 #53 NEW cov: 11770 ft: 14343 corp: 35/1171b lim: 45 exec/s: 53 rss: 69Mb L: 35/45 MS: 1 ChangeBit- 00:07:49.040 [2024-07-20 16:16:17.747771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:adad2431 cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.040 [2024-07-20 16:16:17.747801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.040 [2024-07-20 16:16:17.747918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:adadadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.040 [2024-07-20 16:16:17.747935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.040 [2024-07-20 16:16:17.748056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:adacadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.040 [2024-07-20 16:16:17.748074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.040 [2024-07-20 16:16:17.748190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:adadadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.040 [2024-07-20 16:16:17.748207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.040 #54 NEW cov: 11770 ft: 14351 corp: 36/1207b lim: 45 exec/s: 54 rss: 69Mb L: 36/45 MS: 1 InsertByte- 00:07:49.040 [2024-07-20 16:16:17.787945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ 
(01) qid:0 cid:4 nsid:0 cdw10:adad31ad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.040 [2024-07-20 16:16:17.787974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.040 [2024-07-20 16:16:17.788088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:adadadad cdw11:adad0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.040 [2024-07-20 16:16:17.788107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.040 [2024-07-20 16:16:17.788220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:5656ad56 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.040 [2024-07-20 16:16:17.788241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.040 [2024-07-20 16:16:17.788360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:adadadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.040 [2024-07-20 16:16:17.788378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.040 #60 NEW cov: 11770 ft: 14357 corp: 37/1243b lim: 45 exec/s: 60 rss: 69Mb L: 36/45 MS: 1 InsertByte- 00:07:49.040 [2024-07-20 16:16:17.827275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ad563e1a cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.040 [2024-07-20 16:16:17.827305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.301 #61 NEW cov: 11770 ft: 15091 corp: 38/1260b lim: 45 exec/s: 61 rss: 69Mb L: 17/45 MS: 1 EraseBytes- 00:07:49.301 [2024-07-20 16:16:17.868429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:adad31ad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.301 [2024-07-20 16:16:17.868461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.301 [2024-07-20 16:16:17.868577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:adadadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.301 [2024-07-20 16:16:17.868595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.301 [2024-07-20 16:16:17.868710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:adadadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.301 [2024-07-20 16:16:17.868725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.301 [2024-07-20 16:16:17.868836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:adadadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.301 [2024-07-20 16:16:17.868852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.301 [2024-07-20 16:16:17.868975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO 
SQ (01) qid:0 cid:8 nsid:0 cdw10:adadadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.301 [2024-07-20 16:16:17.868993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:49.301 #62 NEW cov: 11770 ft: 15156 corp: 39/1305b lim: 45 exec/s: 62 rss: 69Mb L: 45/45 MS: 1 ChangeBinInt- 00:07:49.301 [2024-07-20 16:16:17.917755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:adad31ad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.301 [2024-07-20 16:16:17.917783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.301 [2024-07-20 16:16:17.917893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:adadadad cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.301 [2024-07-20 16:16:17.917909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.301 #63 NEW cov: 11770 ft: 15161 corp: 40/1323b lim: 45 exec/s: 31 rss: 69Mb L: 18/45 MS: 1 EraseBytes- 00:07:49.301 #63 DONE cov: 11770 ft: 15161 corp: 40/1323b lim: 45 exec/s: 31 rss: 69Mb 00:07:49.301 Done 63 runs in 2 second(s) 00:07:49.301 16:16:18 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_5.conf 00:07:49.301 16:16:18 -- ../common.sh@72 -- # (( i++ )) 00:07:49.301 16:16:18 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:49.301 16:16:18 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:07:49.301 16:16:18 -- nvmf/run.sh@23 -- # local fuzzer_type=6 00:07:49.301 16:16:18 -- nvmf/run.sh@24 -- # local timen=1 00:07:49.301 16:16:18 -- nvmf/run.sh@25 -- # local core=0x1 00:07:49.301 16:16:18 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:49.301 16:16:18 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf 00:07:49.301 16:16:18 -- nvmf/run.sh@29 -- # printf %02d 6 00:07:49.301 16:16:18 -- nvmf/run.sh@29 -- # port=4406 00:07:49.301 16:16:18 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:49.301 16:16:18 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' 00:07:49.301 16:16:18 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:49.301 16:16:18 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 -r /var/tmp/spdk6.sock 00:07:49.301 [2024-07-20 16:16:18.093584] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
00:07:49.301 [2024-07-20 16:16:18.093653] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2272167 ] 00:07:49.561 EAL: No free 2048 kB hugepages reported on node 1 00:07:49.561 [2024-07-20 16:16:18.269987] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:49.561 [2024-07-20 16:16:18.289154] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:49.561 [2024-07-20 16:16:18.289276] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:49.561 [2024-07-20 16:16:18.340578] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:49.561 [2024-07-20 16:16:18.356870] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:07:49.821 INFO: Running with entropic power schedule (0xFF, 100). 00:07:49.821 INFO: Seed: 2385898204 00:07:49.821 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:07:49.821 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:07:49.821 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:49.821 INFO: A corpus is not provided, starting from an empty corpus 00:07:49.821 #2 INITED exec/s: 0 rss: 59Mb 00:07:49.821 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:49.821 This may also happen if the target rejected all inputs we tried so far 00:07:49.821 [2024-07-20 16:16:18.432778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004d0a cdw11:00000000 00:07:49.821 [2024-07-20 16:16:18.432814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.126 NEW_FUNC[1/669]: 0x49c2c0 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:07:50.126 NEW_FUNC[2/669]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:50.126 #6 NEW cov: 11457 ft: 11458 corp: 2/3b lim: 10 exec/s: 0 rss: 66Mb L: 2/2 MS: 4 ChangeByte-ChangeByte-CrossOver-CrossOver- 00:07:50.126 [2024-07-20 16:16:18.764187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004dff cdw11:00000000 00:07:50.126 [2024-07-20 16:16:18.764227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.126 [2024-07-20 16:16:18.764345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:50.126 [2024-07-20 16:16:18.764364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.126 [2024-07-20 16:16:18.764500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ff0a cdw11:00000000 00:07:50.126 [2024-07-20 16:16:18.764516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.126 #7 NEW cov: 11573 ft: 12181 corp: 3/9b lim: 10 exec/s: 0 rss: 66Mb L: 6/6 MS: 1 
InsertRepeatedBytes- 00:07:50.126 [2024-07-20 16:16:18.814368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004dff cdw11:00000000 00:07:50.126 [2024-07-20 16:16:18.814399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.126 [2024-07-20 16:16:18.814520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ff0a cdw11:00000000 00:07:50.126 [2024-07-20 16:16:18.814538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.126 [2024-07-20 16:16:18.814649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:50.126 [2024-07-20 16:16:18.814666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.126 [2024-07-20 16:16:18.814776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:50.126 [2024-07-20 16:16:18.814794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.126 #8 NEW cov: 11579 ft: 12675 corp: 4/18b lim: 10 exec/s: 0 rss: 66Mb L: 9/9 MS: 1 CopyPart- 00:07:50.126 [2024-07-20 16:16:18.864673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004dff cdw11:00000000 00:07:50.126 [2024-07-20 16:16:18.864700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.126 [2024-07-20 16:16:18.864825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00005959 cdw11:00000000 00:07:50.126 [2024-07-20 16:16:18.864842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.126 [2024-07-20 16:16:18.864962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00005959 cdw11:00000000 00:07:50.126 [2024-07-20 16:16:18.864979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.126 [2024-07-20 16:16:18.865095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:50.126 [2024-07-20 16:16:18.865112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.126 [2024-07-20 16:16:18.865224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000ff0a cdw11:00000000 00:07:50.126 [2024-07-20 16:16:18.865241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:50.126 #9 NEW cov: 11664 ft: 13081 corp: 5/28b lim: 10 exec/s: 0 rss: 66Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:07:50.446 [2024-07-20 16:16:18.904060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000cd0a cdw11:00000000 00:07:50.446 [2024-07-20 16:16:18.904092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:07:50.446 #10 NEW cov: 11664 ft: 13256 corp: 6/30b lim: 10 exec/s: 0 rss: 66Mb L: 2/10 MS: 1 ChangeBit- 00:07:50.446 [2024-07-20 16:16:18.944184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000020a cdw11:00000000 00:07:50.446 [2024-07-20 16:16:18.944213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.446 #11 NEW cov: 11664 ft: 13311 corp: 7/32b lim: 10 exec/s: 0 rss: 66Mb L: 2/10 MS: 1 ChangeBinInt- 00:07:50.446 [2024-07-20 16:16:18.983853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a02 cdw11:00000000 00:07:50.446 [2024-07-20 16:16:18.983880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.446 #12 NEW cov: 11664 ft: 13375 corp: 8/34b lim: 10 exec/s: 0 rss: 66Mb L: 2/10 MS: 1 ShuffleBytes- 00:07:50.446 [2024-07-20 16:16:19.025028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004dff cdw11:00000000 00:07:50.446 [2024-07-20 16:16:19.025055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.446 [2024-07-20 16:16:19.025180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ff0a cdw11:00000000 00:07:50.446 [2024-07-20 16:16:19.025207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.446 [2024-07-20 16:16:19.025318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ff09 cdw11:00000000 00:07:50.446 [2024-07-20 16:16:19.025336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.446 [2024-07-20 16:16:19.025455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:50.446 [2024-07-20 16:16:19.025472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.446 #13 NEW cov: 11664 ft: 13399 corp: 9/43b lim: 10 exec/s: 0 rss: 67Mb L: 9/10 MS: 1 ChangeBinInt- 00:07:50.446 [2024-07-20 16:16:19.064561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000202 cdw11:00000000 00:07:50.446 [2024-07-20 16:16:19.064591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.446 #15 NEW cov: 11664 ft: 13447 corp: 10/46b lim: 10 exec/s: 0 rss: 67Mb L: 3/10 MS: 2 EraseBytes-CrossOver- 00:07:50.446 [2024-07-20 16:16:19.115231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000fafa cdw11:00000000 00:07:50.446 [2024-07-20 16:16:19.115262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.446 [2024-07-20 16:16:19.115390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000fafa cdw11:00000000 00:07:50.446 [2024-07-20 16:16:19.115407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:07:50.446 [2024-07-20 16:16:19.115513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000fa02 cdw11:00000000 00:07:50.446 [2024-07-20 16:16:19.115531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.446 [2024-07-20 16:16:19.115657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000020a cdw11:00000000 00:07:50.446 [2024-07-20 16:16:19.115673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.446 #16 NEW cov: 11664 ft: 13534 corp: 11/54b lim: 10 exec/s: 0 rss: 67Mb L: 8/10 MS: 1 InsertRepeatedBytes- 00:07:50.446 [2024-07-20 16:16:19.175563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004dff cdw11:00000000 00:07:50.446 [2024-07-20 16:16:19.175595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.446 [2024-07-20 16:16:19.175720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ff0a cdw11:00000000 00:07:50.446 [2024-07-20 16:16:19.175740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.446 [2024-07-20 16:16:19.175854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffd0 cdw11:00000000 00:07:50.446 [2024-07-20 16:16:19.175873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.446 [2024-07-20 16:16:19.175987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:50.446 [2024-07-20 16:16:19.176006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.446 #17 NEW cov: 11664 ft: 13552 corp: 12/63b lim: 10 exec/s: 0 rss: 67Mb L: 9/10 MS: 1 ChangeByte- 00:07:50.446 [2024-07-20 16:16:19.215250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004dff cdw11:00000000 00:07:50.446 [2024-07-20 16:16:19.215280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.446 [2024-07-20 16:16:19.215401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ff0a cdw11:00000000 00:07:50.446 [2024-07-20 16:16:19.215419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.446 [2024-07-20 16:16:19.215540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000fffe cdw11:00000000 00:07:50.446 [2024-07-20 16:16:19.215561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.446 [2024-07-20 16:16:19.215682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:50.446 [2024-07-20 16:16:19.215699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 
cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.446 #18 NEW cov: 11664 ft: 13568 corp: 13/72b lim: 10 exec/s: 0 rss: 67Mb L: 9/10 MS: 1 ChangeBit- 00:07:50.705 [2024-07-20 16:16:19.265234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000020a cdw11:00000000 00:07:50.705 [2024-07-20 16:16:19.265264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.705 #19 NEW cov: 11664 ft: 13629 corp: 14/74b lim: 10 exec/s: 0 rss: 67Mb L: 2/10 MS: 1 CrossOver- 00:07:50.705 [2024-07-20 16:16:19.315338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000202 cdw11:00000000 00:07:50.705 [2024-07-20 16:16:19.315368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.705 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:50.705 #20 NEW cov: 11687 ft: 13712 corp: 15/76b lim: 10 exec/s: 0 rss: 67Mb L: 2/10 MS: 1 CopyPart- 00:07:50.705 [2024-07-20 16:16:19.375935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00009696 cdw11:00000000 00:07:50.705 [2024-07-20 16:16:19.375967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.705 [2024-07-20 16:16:19.376090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00009696 cdw11:00000000 00:07:50.705 [2024-07-20 16:16:19.376108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.705 [2024-07-20 16:16:19.376229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00009608 cdw11:00000000 00:07:50.705 [2024-07-20 16:16:19.376246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.705 #23 NEW cov: 11687 ft: 13742 corp: 16/82b lim: 10 exec/s: 0 rss: 67Mb L: 6/10 MS: 3 CopyPart-ChangeBinInt-InsertRepeatedBytes- 00:07:50.705 [2024-07-20 16:16:19.425886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000047ff cdw11:00000000 00:07:50.705 [2024-07-20 16:16:19.425917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.705 [2024-07-20 16:16:19.426048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ff0a cdw11:00000000 00:07:50.705 [2024-07-20 16:16:19.426065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.705 #25 NEW cov: 11687 ft: 13887 corp: 17/86b lim: 10 exec/s: 25 rss: 67Mb L: 4/10 MS: 2 ChangeByte-CrossOver- 00:07:50.705 [2024-07-20 16:16:19.476005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004d0a cdw11:00000000 00:07:50.705 [2024-07-20 16:16:19.476036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.705 [2024-07-20 16:16:19.476169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 
cdw10:0000ffff cdw11:00000000 00:07:50.705 [2024-07-20 16:16:19.476187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.705 #26 NEW cov: 11687 ft: 13905 corp: 18/90b lim: 10 exec/s: 26 rss: 67Mb L: 4/10 MS: 1 CrossOver- 00:07:50.964 [2024-07-20 16:16:19.536036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000da02 cdw11:00000000 00:07:50.964 [2024-07-20 16:16:19.536068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.964 #27 NEW cov: 11687 ft: 13956 corp: 19/92b lim: 10 exec/s: 27 rss: 68Mb L: 2/10 MS: 1 ChangeByte- 00:07:50.964 [2024-07-20 16:16:19.587001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004dff cdw11:00000000 00:07:50.964 [2024-07-20 16:16:19.587032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.964 [2024-07-20 16:16:19.587160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ff0a cdw11:00000000 00:07:50.964 [2024-07-20 16:16:19.587180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.965 [2024-07-20 16:16:19.587305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000fffe cdw11:00000000 00:07:50.965 [2024-07-20 16:16:19.587324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.965 [2024-07-20 16:16:19.587449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ff55 cdw11:00000000 00:07:50.965 [2024-07-20 16:16:19.587468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.965 [2024-07-20 16:16:19.587592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000ff0a cdw11:00000000 00:07:50.965 [2024-07-20 16:16:19.587610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:50.965 #28 NEW cov: 11687 ft: 13979 corp: 20/102b lim: 10 exec/s: 28 rss: 68Mb L: 10/10 MS: 1 InsertByte- 00:07:50.965 [2024-07-20 16:16:19.636984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000dff cdw11:00000000 00:07:50.965 [2024-07-20 16:16:19.637013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.965 [2024-07-20 16:16:19.637125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ff0a cdw11:00000000 00:07:50.965 [2024-07-20 16:16:19.637144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.965 [2024-07-20 16:16:19.637261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000fffe cdw11:00000000 00:07:50.965 [2024-07-20 16:16:19.637281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.965 [2024-07-20 
16:16:19.637379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:50.965 [2024-07-20 16:16:19.637398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.965 #29 NEW cov: 11687 ft: 13988 corp: 21/111b lim: 10 exec/s: 29 rss: 68Mb L: 9/10 MS: 1 ChangeBit- 00:07:50.965 [2024-07-20 16:16:19.687256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00003838 cdw11:00000000 00:07:50.965 [2024-07-20 16:16:19.687288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.965 [2024-07-20 16:16:19.687414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00003838 cdw11:00000000 00:07:50.965 [2024-07-20 16:16:19.687433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.965 [2024-07-20 16:16:19.687557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00003838 cdw11:00000000 00:07:50.965 [2024-07-20 16:16:19.687574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.965 [2024-07-20 16:16:19.687693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00003838 cdw11:00000000 00:07:50.965 [2024-07-20 16:16:19.687710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.965 [2024-07-20 16:16:19.687821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000da02 cdw11:00000000 00:07:50.965 [2024-07-20 16:16:19.687840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:50.965 #30 NEW cov: 11687 ft: 14008 corp: 22/121b lim: 10 exec/s: 30 rss: 68Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:07:50.965 [2024-07-20 16:16:19.736676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004d0a cdw11:00000000 00:07:50.965 [2024-07-20 16:16:19.736706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.965 [2024-07-20 16:16:19.736823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ff09 cdw11:00000000 00:07:50.965 [2024-07-20 16:16:19.736839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.965 [2024-07-20 16:16:19.736953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:50.965 [2024-07-20 16:16:19.736971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.965 #31 NEW cov: 11687 ft: 14087 corp: 23/128b lim: 10 exec/s: 31 rss: 68Mb L: 7/10 MS: 1 EraseBytes- 00:07:51.224 [2024-07-20 16:16:19.797425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000dff cdw11:00000000 00:07:51.224 [2024-07-20 16:16:19.797458] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.224 [2024-07-20 16:16:19.797588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000aff cdw11:00000000 00:07:51.224 [2024-07-20 16:16:19.797606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.224 [2024-07-20 16:16:19.797721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000fffe cdw11:00000000 00:07:51.224 [2024-07-20 16:16:19.797739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.224 [2024-07-20 16:16:19.797857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:51.224 [2024-07-20 16:16:19.797875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.224 #32 NEW cov: 11687 ft: 14147 corp: 24/137b lim: 10 exec/s: 32 rss: 68Mb L: 9/10 MS: 1 ShuffleBytes- 00:07:51.224 [2024-07-20 16:16:19.846890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000b4d cdw11:00000000 00:07:51.224 [2024-07-20 16:16:19.846920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.224 [2024-07-20 16:16:19.847046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:51.224 [2024-07-20 16:16:19.847075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.224 [2024-07-20 16:16:19.847193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:51.224 [2024-07-20 16:16:19.847209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.224 #34 NEW cov: 11687 ft: 14154 corp: 25/144b lim: 10 exec/s: 34 rss: 68Mb L: 7/10 MS: 2 ChangeBit-CrossOver- 00:07:51.224 [2024-07-20 16:16:19.897914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00003838 cdw11:00000000 00:07:51.225 [2024-07-20 16:16:19.897943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.225 [2024-07-20 16:16:19.898056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00003838 cdw11:00000000 00:07:51.225 [2024-07-20 16:16:19.898074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.225 [2024-07-20 16:16:19.898196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00003838 cdw11:00000000 00:07:51.225 [2024-07-20 16:16:19.898214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.225 [2024-07-20 16:16:19.898334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:000038da cdw11:00000000 00:07:51.225 [2024-07-20 16:16:19.898352] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.225 [2024-07-20 16:16:19.898466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00003802 cdw11:00000000 00:07:51.225 [2024-07-20 16:16:19.898493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:51.225 #35 NEW cov: 11687 ft: 14169 corp: 26/154b lim: 10 exec/s: 35 rss: 68Mb L: 10/10 MS: 1 ShuffleBytes- 00:07:51.225 [2024-07-20 16:16:19.947411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004d09 cdw11:00000000 00:07:51.225 [2024-07-20 16:16:19.947439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.225 [2024-07-20 16:16:19.947561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.225 [2024-07-20 16:16:19.947579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.225 #36 NEW cov: 11687 ft: 14195 corp: 27/159b lim: 10 exec/s: 36 rss: 68Mb L: 5/10 MS: 1 EraseBytes- 00:07:51.225 [2024-07-20 16:16:19.987742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000fafa cdw11:00000000 00:07:51.225 [2024-07-20 16:16:19.987770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.225 [2024-07-20 16:16:19.987888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00002bfa cdw11:00000000 00:07:51.225 [2024-07-20 16:16:19.987906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.225 [2024-07-20 16:16:19.988029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000fa02 cdw11:00000000 00:07:51.225 [2024-07-20 16:16:19.988046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.225 [2024-07-20 16:16:19.988152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000020a cdw11:00000000 00:07:51.225 [2024-07-20 16:16:19.988170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.225 #37 NEW cov: 11687 ft: 14226 corp: 28/167b lim: 10 exec/s: 37 rss: 68Mb L: 8/10 MS: 1 ChangeByte- 00:07:51.484 [2024-07-20 16:16:20.047997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004d24 cdw11:00000000 00:07:51.484 [2024-07-20 16:16:20.048026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.484 [2024-07-20 16:16:20.048151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:51.484 [2024-07-20 16:16:20.048168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.484 [2024-07-20 16:16:20.048290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ff0a cdw11:00000000 00:07:51.484 [2024-07-20 16:16:20.048307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.484 #38 NEW cov: 11687 ft: 14262 corp: 29/173b lim: 10 exec/s: 38 rss: 68Mb L: 6/10 MS: 1 ChangeByte- 00:07:51.484 [2024-07-20 16:16:20.098114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000020a cdw11:00000000 00:07:51.485 [2024-07-20 16:16:20.098145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.485 [2024-07-20 16:16:20.098269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00004d0a cdw11:00000000 00:07:51.485 [2024-07-20 16:16:20.098287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.485 [2024-07-20 16:16:20.098402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ff09 cdw11:00000000 00:07:51.485 [2024-07-20 16:16:20.098420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.485 #39 NEW cov: 11687 ft: 14271 corp: 30/180b lim: 10 exec/s: 39 rss: 68Mb L: 7/10 MS: 1 CrossOver- 00:07:51.485 [2024-07-20 16:16:20.138239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00003838 cdw11:00000000 00:07:51.485 [2024-07-20 16:16:20.138268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.485 [2024-07-20 16:16:20.138387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00003838 cdw11:00000000 00:07:51.485 [2024-07-20 16:16:20.138406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.485 [2024-07-20 16:16:20.138524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:000038da cdw11:00000000 00:07:51.485 [2024-07-20 16:16:20.138545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.485 [2024-07-20 16:16:20.138669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:000038da cdw11:00000000 00:07:51.485 [2024-07-20 16:16:20.138686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.485 [2024-07-20 16:16:20.138819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00003802 cdw11:00000000 00:07:51.485 [2024-07-20 16:16:20.138851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:51.485 #40 NEW cov: 11687 ft: 14286 corp: 31/190b lim: 10 exec/s: 40 rss: 69Mb L: 10/10 MS: 1 CopyPart- 00:07:51.485 [2024-07-20 16:16:20.198451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00009696 cdw11:00000000 00:07:51.485 [2024-07-20 16:16:20.198492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f 
p:0 m:0 dnr:0 00:07:51.485 [2024-07-20 16:16:20.198613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000964d cdw11:00000000 00:07:51.485 [2024-07-20 16:16:20.198630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.485 [2024-07-20 16:16:20.198748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:51.485 [2024-07-20 16:16:20.198764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.485 #41 NEW cov: 11687 ft: 14303 corp: 32/196b lim: 10 exec/s: 41 rss: 69Mb L: 6/10 MS: 1 CrossOver- 00:07:51.485 [2024-07-20 16:16:20.248347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:00000000 00:07:51.485 [2024-07-20 16:16:20.248375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.485 [2024-07-20 16:16:20.248496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.485 [2024-07-20 16:16:20.248515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.485 #42 NEW cov: 11687 ft: 14355 corp: 33/201b lim: 10 exec/s: 42 rss: 69Mb L: 5/10 MS: 1 InsertRepeatedBytes- 00:07:51.745 [2024-07-20 16:16:20.298235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000dada cdw11:00000000 00:07:51.745 [2024-07-20 16:16:20.298265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.745 #43 NEW cov: 11687 ft: 14402 corp: 34/204b lim: 10 exec/s: 43 rss: 69Mb L: 3/10 MS: 1 CopyPart- 00:07:51.745 [2024-07-20 16:16:20.338556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004dff cdw11:00000000 00:07:51.745 [2024-07-20 16:16:20.338587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.745 [2024-07-20 16:16:20.338711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ff0a cdw11:00000000 00:07:51.745 [2024-07-20 16:16:20.338728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.745 [2024-07-20 16:16:20.338855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ff09 cdw11:00000000 00:07:51.745 [2024-07-20 16:16:20.338872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.745 [2024-07-20 16:16:20.338996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.745 [2024-07-20 16:16:20.339017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.745 #44 NEW cov: 11687 ft: 14415 corp: 35/213b lim: 10 exec/s: 44 rss: 69Mb L: 9/10 MS: 1 CopyPart- 00:07:51.745 [2024-07-20 16:16:20.388499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000dada cdw11:00000000 00:07:51.745 [2024-07-20 16:16:20.388529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.745 #45 NEW cov: 11687 ft: 14422 corp: 36/215b lim: 10 exec/s: 22 rss: 69Mb L: 2/10 MS: 1 EraseBytes- 00:07:51.745 #45 DONE cov: 11687 ft: 14422 corp: 36/215b lim: 10 exec/s: 22 rss: 69Mb 00:07:51.745 Done 45 runs in 2 second(s) 00:07:51.745 16:16:20 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_6.conf 00:07:51.745 16:16:20 -- ../common.sh@72 -- # (( i++ )) 00:07:51.745 16:16:20 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:51.745 16:16:20 -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:07:51.745 16:16:20 -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:07:51.745 16:16:20 -- nvmf/run.sh@24 -- # local timen=1 00:07:51.745 16:16:20 -- nvmf/run.sh@25 -- # local core=0x1 00:07:51.745 16:16:20 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:51.745 16:16:20 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:07:51.745 16:16:20 -- nvmf/run.sh@29 -- # printf %02d 7 00:07:51.745 16:16:20 -- nvmf/run.sh@29 -- # port=4407 00:07:51.745 16:16:20 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:51.745 16:16:20 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:07:51.745 16:16:20 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:51.745 16:16:20 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 -r /var/tmp/spdk7.sock 00:07:52.005 [2024-07-20 16:16:20.571405] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:07:52.005 [2024-07-20 16:16:20.571478] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2272489 ] 00:07:52.005 EAL: No free 2048 kB hugepages reported on node 1 00:07:52.005 [2024-07-20 16:16:20.749739] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:52.005 [2024-07-20 16:16:20.769040] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:52.005 [2024-07-20 16:16:20.769179] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:52.265 [2024-07-20 16:16:20.820546] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:52.265 [2024-07-20 16:16:20.836871] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:07:52.265 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:52.265 INFO: Seed: 569917523 00:07:52.265 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:07:52.265 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:07:52.265 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:52.265 INFO: A corpus is not provided, starting from an empty corpus 00:07:52.265 #2 INITED exec/s: 0 rss: 59Mb 00:07:52.265 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:52.265 This may also happen if the target rejected all inputs we tried so far 00:07:52.265 [2024-07-20 16:16:20.903399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000012e cdw11:00000000 00:07:52.265 [2024-07-20 16:16:20.903448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.265 [2024-07-20 16:16:20.903555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000f262 cdw11:00000000 00:07:52.265 [2024-07-20 16:16:20.903572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.265 [2024-07-20 16:16:20.903669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00009a03 cdw11:00000000 00:07:52.265 [2024-07-20 16:16:20.903696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.265 [2024-07-20 16:16:20.903813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00001faa cdw11:00000000 00:07:52.265 [2024-07-20 16:16:20.903831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.525 NEW_FUNC[1/666]: 0x49ccb0 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:07:52.525 NEW_FUNC[2/666]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:52.525 #5 NEW cov: 11433 ft: 11452 corp: 2/10b lim: 10 exec/s: 0 rss: 66Mb L: 9/9 MS: 3 ShuffleBytes-ChangeBit-CMP- DE: "\001.\362b\232\003\037\252"- 00:07:52.525 [2024-07-20 16:16:21.234491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a01 cdw11:00000000 00:07:52.525 [2024-07-20 16:16:21.234536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.525 [2024-07-20 16:16:21.234678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002ef2 cdw11:00000000 00:07:52.525 [2024-07-20 16:16:21.234698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.525 [2024-07-20 16:16:21.234829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000629a cdw11:00000000 00:07:52.525 [2024-07-20 16:16:21.234851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.525 [2024-07-20 16:16:21.234974] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000031f cdw11:00000000 00:07:52.525 [2024-07-20 16:16:21.234994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.525 NEW_FUNC[1/3]: 0x16db3b0 in nvme_qpair_check_enabled /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:637 00:07:52.525 NEW_FUNC[2/3]: 0x1c72310 in spdk_thread_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1151 00:07:52.525 #6 NEW cov: 11573 ft: 12100 corp: 3/19b lim: 10 exec/s: 0 rss: 66Mb L: 9/9 MS: 1 PersAutoDict- DE: "\001.\362b\232\003\037\252"- 00:07:52.525 [2024-07-20 16:16:21.283738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000010a cdw11:00000000 00:07:52.525 [2024-07-20 16:16:21.283770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.525 #7 NEW cov: 11579 ft: 12586 corp: 4/21b lim: 10 exec/s: 0 rss: 66Mb L: 2/9 MS: 1 CrossOver- 00:07:52.785 [2024-07-20 16:16:21.334021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000280a cdw11:00000000 00:07:52.785 [2024-07-20 16:16:21.334049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.785 #8 NEW cov: 11664 ft: 12818 corp: 5/23b lim: 10 exec/s: 0 rss: 66Mb L: 2/9 MS: 1 ChangeByte- 00:07:52.785 [2024-07-20 16:16:21.384794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:52.785 [2024-07-20 16:16:21.384826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.785 [2024-07-20 16:16:21.384950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:52.785 [2024-07-20 16:16:21.384966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.785 [2024-07-20 16:16:21.385085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:52.785 [2024-07-20 16:16:21.385100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.785 [2024-07-20 16:16:21.385221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:52.785 [2024-07-20 16:16:21.385238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.785 #9 NEW cov: 11664 ft: 12920 corp: 6/32b lim: 10 exec/s: 0 rss: 66Mb L: 9/9 MS: 1 InsertRepeatedBytes- 00:07:52.785 [2024-07-20 16:16:21.435147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:52.785 [2024-07-20 16:16:21.435174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.785 [2024-07-20 16:16:21.435293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:52.785 [2024-07-20 
16:16:21.435310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.785 [2024-07-20 16:16:21.435421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:52.785 [2024-07-20 16:16:21.435440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.785 [2024-07-20 16:16:21.435566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ff7a cdw11:00000000 00:07:52.785 [2024-07-20 16:16:21.435582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.785 [2024-07-20 16:16:21.435705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:52.785 [2024-07-20 16:16:21.435722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:52.785 #10 NEW cov: 11664 ft: 13051 corp: 7/42b lim: 10 exec/s: 0 rss: 66Mb L: 10/10 MS: 1 InsertByte- 00:07:52.785 [2024-07-20 16:16:21.485158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:52.785 [2024-07-20 16:16:21.485186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.785 [2024-07-20 16:16:21.485314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000feff cdw11:00000000 00:07:52.785 [2024-07-20 16:16:21.485330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.785 [2024-07-20 16:16:21.485450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:52.785 [2024-07-20 16:16:21.485479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.785 [2024-07-20 16:16:21.485601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:52.785 [2024-07-20 16:16:21.485620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.785 #11 NEW cov: 11664 ft: 13146 corp: 8/51b lim: 10 exec/s: 0 rss: 67Mb L: 9/10 MS: 1 ChangeBit- 00:07:52.785 [2024-07-20 16:16:21.534678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000280a cdw11:00000000 00:07:52.785 [2024-07-20 16:16:21.534707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.785 #12 NEW cov: 11664 ft: 13196 corp: 9/53b lim: 10 exec/s: 0 rss: 67Mb L: 2/10 MS: 1 ShuffleBytes- 00:07:53.044 [2024-07-20 16:16:21.595031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000010a cdw11:00000000 00:07:53.044 [2024-07-20 16:16:21.595062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.044 [2024-07-20 16:16:21.595148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000010a cdw11:00000000 00:07:53.044 [2024-07-20 16:16:21.595166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.044 #13 NEW cov: 11664 ft: 13493 corp: 10/57b lim: 10 exec/s: 0 rss: 67Mb L: 4/10 MS: 1 CopyPart- 00:07:53.044 [2024-07-20 16:16:21.645217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000280a cdw11:00000000 00:07:53.044 [2024-07-20 16:16:21.645246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.044 [2024-07-20 16:16:21.645370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000010a cdw11:00000000 00:07:53.044 [2024-07-20 16:16:21.645388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.044 #14 NEW cov: 11664 ft: 13566 corp: 11/61b lim: 10 exec/s: 0 rss: 67Mb L: 4/10 MS: 1 CrossOver- 00:07:53.044 [2024-07-20 16:16:21.695373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:53.044 [2024-07-20 16:16:21.695402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.044 [2024-07-20 16:16:21.695539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000002a cdw11:00000000 00:07:53.044 [2024-07-20 16:16:21.695558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.044 #16 NEW cov: 11664 ft: 13585 corp: 12/65b lim: 10 exec/s: 0 rss: 67Mb L: 4/10 MS: 2 ChangeBit-InsertRepeatedBytes- 00:07:53.044 [2024-07-20 16:16:21.745597] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000002a cdw11:00000000 00:07:53.044 [2024-07-20 16:16:21.745626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.044 [2024-07-20 16:16:21.745753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:53.044 [2024-07-20 16:16:21.745770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.044 #17 NEW cov: 11664 ft: 13607 corp: 13/69b lim: 10 exec/s: 0 rss: 67Mb L: 4/10 MS: 1 ShuffleBytes- 00:07:53.044 [2024-07-20 16:16:21.795424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:53.044 [2024-07-20 16:16:21.795456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.045 #18 NEW cov: 11664 ft: 13656 corp: 14/71b lim: 10 exec/s: 0 rss: 67Mb L: 2/10 MS: 1 CopyPart- 00:07:53.045 [2024-07-20 16:16:21.846492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:53.045 [2024-07-20 16:16:21.846520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.045 [2024-07-20 16:16:21.846642] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:53.045 [2024-07-20 16:16:21.846662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.045 [2024-07-20 16:16:21.846786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:53.045 [2024-07-20 16:16:21.846803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.045 [2024-07-20 16:16:21.846927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:53.045 [2024-07-20 16:16:21.846944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.045 [2024-07-20 16:16:21.847069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:53.045 [2024-07-20 16:16:21.847087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:53.304 #19 NEW cov: 11664 ft: 13672 corp: 15/81b lim: 10 exec/s: 0 rss: 67Mb L: 10/10 MS: 1 CopyPart- 00:07:53.304 [2024-07-20 16:16:21.896669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000280a cdw11:00000000 00:07:53.304 [2024-07-20 16:16:21.896698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.304 [2024-07-20 16:16:21.896830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000010a cdw11:00000000 00:07:53.304 [2024-07-20 16:16:21.896848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.304 [2024-07-20 16:16:21.896969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00009494 cdw11:00000000 00:07:53.304 [2024-07-20 16:16:21.896986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.304 [2024-07-20 16:16:21.897122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00009494 cdw11:00000000 00:07:53.304 [2024-07-20 16:16:21.897139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.304 [2024-07-20 16:16:21.897215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00009494 cdw11:00000000 00:07:53.304 [2024-07-20 16:16:21.897234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:53.304 #20 NEW cov: 11664 ft: 13689 corp: 16/91b lim: 10 exec/s: 20 rss: 67Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:07:53.304 [2024-07-20 16:16:21.956256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000280a cdw11:00000000 00:07:53.304 [2024-07-20 16:16:21.956287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.304 [2024-07-20 16:16:21.956412] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000280a cdw11:00000000 00:07:53.304 [2024-07-20 16:16:21.956430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.304 #21 NEW cov: 11664 ft: 13721 corp: 17/95b lim: 10 exec/s: 21 rss: 67Mb L: 4/10 MS: 1 CopyPart- 00:07:53.304 [2024-07-20 16:16:22.006637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00006969 cdw11:00000000 00:07:53.304 [2024-07-20 16:16:22.006665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.304 [2024-07-20 16:16:22.006790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006969 cdw11:00000000 00:07:53.304 [2024-07-20 16:16:22.006811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.304 [2024-07-20 16:16:22.006937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000280a cdw11:00000000 00:07:53.304 [2024-07-20 16:16:22.006954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.304 #22 NEW cov: 11664 ft: 13899 corp: 18/101b lim: 10 exec/s: 22 rss: 67Mb L: 6/10 MS: 1 InsertRepeatedBytes- 00:07:53.304 [2024-07-20 16:16:22.056228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00006969 cdw11:00000000 00:07:53.304 [2024-07-20 16:16:22.056257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.304 #24 NEW cov: 11664 ft: 13953 corp: 19/104b lim: 10 exec/s: 24 rss: 67Mb L: 3/10 MS: 2 ShuffleBytes-CrossOver- 00:07:53.304 [2024-07-20 16:16:22.106989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:53.304 [2024-07-20 16:16:22.107019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.304 [2024-07-20 16:16:22.107135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:53.304 [2024-07-20 16:16:22.107151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.304 [2024-07-20 16:16:22.107266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:53.304 [2024-07-20 16:16:22.107284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.562 #25 NEW cov: 11664 ft: 14025 corp: 20/110b lim: 10 exec/s: 25 rss: 67Mb L: 6/10 MS: 1 InsertRepeatedBytes- 00:07:53.562 [2024-07-20 16:16:22.157304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00006969 cdw11:00000000 00:07:53.562 [2024-07-20 16:16:22.157334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.562 [2024-07-20 16:16:22.157463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 
cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:53.562 [2024-07-20 16:16:22.157480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.562 [2024-07-20 16:16:22.157601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000069 cdw11:00000000 00:07:53.562 [2024-07-20 16:16:22.157619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.562 [2024-07-20 16:16:22.157745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00006928 cdw11:00000000 00:07:53.562 [2024-07-20 16:16:22.157763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.562 #26 NEW cov: 11664 ft: 14040 corp: 21/119b lim: 10 exec/s: 26 rss: 67Mb L: 9/10 MS: 1 InsertRepeatedBytes- 00:07:53.562 [2024-07-20 16:16:22.217165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a01 cdw11:00000000 00:07:53.562 [2024-07-20 16:16:22.217195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.562 [2024-07-20 16:16:22.217319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000282e cdw11:00000000 00:07:53.562 [2024-07-20 16:16:22.217336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.562 [2024-07-20 16:16:22.217460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000af2 cdw11:00000000 00:07:53.562 [2024-07-20 16:16:22.217482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.562 #27 NEW cov: 11664 ft: 14065 corp: 22/126b lim: 10 exec/s: 27 rss: 67Mb L: 7/10 MS: 1 CrossOver- 00:07:53.562 [2024-07-20 16:16:22.277206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000010a cdw11:00000000 00:07:53.562 [2024-07-20 16:16:22.277236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.562 [2024-07-20 16:16:22.277365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000110 cdw11:00000000 00:07:53.562 [2024-07-20 16:16:22.277383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.562 #28 NEW cov: 11664 ft: 14101 corp: 23/130b lim: 10 exec/s: 28 rss: 67Mb L: 4/10 MS: 1 ChangeBinInt- 00:07:53.562 [2024-07-20 16:16:22.338009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:53.562 [2024-07-20 16:16:22.338038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.562 [2024-07-20 16:16:22.338166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:53.562 [2024-07-20 16:16:22.338185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:07:53.562 [2024-07-20 16:16:22.338308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:53.562 [2024-07-20 16:16:22.338327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.562 [2024-07-20 16:16:22.338439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ff7a cdw11:00000000 00:07:53.562 [2024-07-20 16:16:22.338461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.562 [2024-07-20 16:16:22.338583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:53.562 [2024-07-20 16:16:22.338603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:53.562 #29 NEW cov: 11664 ft: 14123 corp: 24/140b lim: 10 exec/s: 29 rss: 68Mb L: 10/10 MS: 1 ShuffleBytes- 00:07:53.820 [2024-07-20 16:16:22.397609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000010a cdw11:00000000 00:07:53.820 [2024-07-20 16:16:22.397646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.820 [2024-07-20 16:16:22.397774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000011e cdw11:00000000 00:07:53.820 [2024-07-20 16:16:22.397791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.820 #30 NEW cov: 11664 ft: 14132 corp: 25/145b lim: 10 exec/s: 30 rss: 68Mb L: 5/10 MS: 1 InsertByte- 00:07:53.820 [2024-07-20 16:16:22.457818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a01 cdw11:00000000 00:07:53.820 [2024-07-20 16:16:22.457849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.820 [2024-07-20 16:16:22.457985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002ef2 cdw11:00000000 00:07:53.820 [2024-07-20 16:16:22.458004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.820 #31 NEW cov: 11664 ft: 14166 corp: 26/150b lim: 10 exec/s: 31 rss: 68Mb L: 5/10 MS: 1 EraseBytes- 00:07:53.820 [2024-07-20 16:16:22.507820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000101 cdw11:00000000 00:07:53.820 [2024-07-20 16:16:22.507848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.820 #32 NEW cov: 11664 ft: 14174 corp: 27/153b lim: 10 exec/s: 32 rss: 68Mb L: 3/10 MS: 1 EraseBytes- 00:07:53.820 [2024-07-20 16:16:22.557930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002860 cdw11:00000000 00:07:53.820 [2024-07-20 16:16:22.557961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.820 #33 NEW cov: 11664 ft: 14204 corp: 28/155b lim: 10 exec/s: 33 rss: 68Mb L: 2/10 MS: 1 ChangeByte- 
00:07:53.820 [2024-07-20 16:16:22.618833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000012e cdw11:00000000 00:07:53.820 [2024-07-20 16:16:22.618863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.820 [2024-07-20 16:16:22.618986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000f201 cdw11:00000000 00:07:53.820 [2024-07-20 16:16:22.619001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.820 [2024-07-20 16:16:22.619117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:53.820 [2024-07-20 16:16:22.619135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.820 [2024-07-20 16:16:22.619246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:000000aa cdw11:00000000 00:07:53.820 [2024-07-20 16:16:22.619263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.079 #34 NEW cov: 11664 ft: 14218 corp: 29/164b lim: 10 exec/s: 34 rss: 68Mb L: 9/10 MS: 1 CMP- DE: "\001\000\000\000"- 00:07:54.079 [2024-07-20 16:16:22.668471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000012e cdw11:00000000 00:07:54.079 [2024-07-20 16:16:22.668502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.079 [2024-07-20 16:16:22.668624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000f2aa cdw11:00000000 00:07:54.079 [2024-07-20 16:16:22.668642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.079 #35 NEW cov: 11664 ft: 14230 corp: 30/169b lim: 10 exec/s: 35 rss: 68Mb L: 5/10 MS: 1 EraseBytes- 00:07:54.079 [2024-07-20 16:16:22.718653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000010a cdw11:00000000 00:07:54.079 [2024-07-20 16:16:22.718681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.079 [2024-07-20 16:16:22.718803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000110 cdw11:00000000 00:07:54.079 [2024-07-20 16:16:22.718820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.079 #36 NEW cov: 11664 ft: 14243 corp: 31/173b lim: 10 exec/s: 36 rss: 68Mb L: 4/10 MS: 1 ShuffleBytes- 00:07:54.079 [2024-07-20 16:16:22.768535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffec cdw11:00000000 00:07:54.079 [2024-07-20 16:16:22.768563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.079 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:54.079 #37 NEW cov: 11687 ft: 14283 corp: 32/175b lim: 10 exec/s: 37 
rss: 68Mb L: 2/10 MS: 1 ChangeBinInt- 00:07:54.079 [2024-07-20 16:16:22.819598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002601 cdw11:00000000 00:07:54.079 [2024-07-20 16:16:22.819626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.079 [2024-07-20 16:16:22.819744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002ef2 cdw11:00000000 00:07:54.079 [2024-07-20 16:16:22.819774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.079 [2024-07-20 16:16:22.819890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000100 cdw11:00000000 00:07:54.079 [2024-07-20 16:16:22.819906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.079 [2024-07-20 16:16:22.820024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:54.079 [2024-07-20 16:16:22.820042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.079 [2024-07-20 16:16:22.820160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000aa02 cdw11:00000000 00:07:54.079 [2024-07-20 16:16:22.820176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:54.079 #38 NEW cov: 11687 ft: 14307 corp: 33/185b lim: 10 exec/s: 38 rss: 68Mb L: 10/10 MS: 1 InsertByte- 00:07:54.079 [2024-07-20 16:16:22.869808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002e0a cdw11:00000000 00:07:54.079 [2024-07-20 16:16:22.869837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.079 [2024-07-20 16:16:22.869963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000f20a cdw11:00000000 00:07:54.079 [2024-07-20 16:16:22.869980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.079 [2024-07-20 16:16:22.870098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000128 cdw11:00000000 00:07:54.079 [2024-07-20 16:16:22.870115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.079 [2024-07-20 16:16:22.870233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00002e0a cdw11:00000000 00:07:54.079 [2024-07-20 16:16:22.870251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.079 [2024-07-20 16:16:22.870372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000f262 cdw11:00000000 00:07:54.079 [2024-07-20 16:16:22.870389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:54.338 #39 NEW cov: 11687 ft: 14372 corp: 34/195b lim: 10 exec/s: 19 
rss: 68Mb L: 10/10 MS: 1 CopyPart- 00:07:54.338 #39 DONE cov: 11687 ft: 14372 corp: 34/195b lim: 10 exec/s: 19 rss: 68Mb 00:07:54.338 ###### Recommended dictionary. ###### 00:07:54.338 "\001.\362b\232\003\037\252" # Uses: 1 00:07:54.338 "\001\000\000\000" # Uses: 0 00:07:54.338 ###### End of recommended dictionary. ###### 00:07:54.338 Done 39 runs in 2 second(s) 00:07:54.338 16:16:23 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_7.conf 00:07:54.338 16:16:23 -- ../common.sh@72 -- # (( i++ )) 00:07:54.338 16:16:23 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:54.338 16:16:23 -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1 00:07:54.338 16:16:23 -- nvmf/run.sh@23 -- # local fuzzer_type=8 00:07:54.338 16:16:23 -- nvmf/run.sh@24 -- # local timen=1 00:07:54.338 16:16:23 -- nvmf/run.sh@25 -- # local core=0x1 00:07:54.338 16:16:23 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:54.338 16:16:23 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf 00:07:54.338 16:16:23 -- nvmf/run.sh@29 -- # printf %02d 8 00:07:54.338 16:16:23 -- nvmf/run.sh@29 -- # port=4408 00:07:54.338 16:16:23 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:54.338 16:16:23 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' 00:07:54.338 16:16:23 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:54.338 16:16:23 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 -r /var/tmp/spdk8.sock 00:07:54.338 [2024-07-20 16:16:23.052815] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:07:54.338 [2024-07-20 16:16:23.052902] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2273028 ] 00:07:54.338 EAL: No free 2048 kB hugepages reported on node 1 00:07:54.595 [2024-07-20 16:16:23.231035] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:54.595 [2024-07-20 16:16:23.250259] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:54.595 [2024-07-20 16:16:23.250397] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:54.595 [2024-07-20 16:16:23.301659] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:54.595 [2024-07-20 16:16:23.317976] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:07:54.595 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:54.595 INFO: Seed: 3050937406 00:07:54.596 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:07:54.596 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:07:54.596 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:54.596 INFO: A corpus is not provided, starting from an empty corpus 00:07:54.596 [2024-07-20 16:16:23.363269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.596 [2024-07-20 16:16:23.363298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.596 #2 INITED cov: 11488 ft: 11489 corp: 1/1b exec/s: 0 rss: 64Mb 00:07:54.596 [2024-07-20 16:16:23.393870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.596 [2024-07-20 16:16:23.393896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.596 [2024-07-20 16:16:23.393953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.596 [2024-07-20 16:16:23.393967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.596 [2024-07-20 16:16:23.394023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.596 [2024-07-20 16:16:23.394036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.596 [2024-07-20 16:16:23.394092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.596 [2024-07-20 16:16:23.394106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.596 [2024-07-20 16:16:23.394166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.596 [2024-07-20 16:16:23.394180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:54.853 #3 NEW cov: 11601 ft: 12762 corp: 2/6b lim: 5 exec/s: 0 rss: 65Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:54.853 [2024-07-20 16:16:23.443526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.853 [2024-07-20 16:16:23.443551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.853 [2024-07-20 16:16:23.443625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.853 [2024-07-20 16:16:23.443640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.853 #4 NEW cov: 11607 ft: 13317 corp: 3/8b lim: 5 exec/s: 0 rss: 65Mb L: 2/5 MS: 1 CrossOver- 00:07:54.853 [2024-07-20 16:16:23.483602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.853 [2024-07-20 16:16:23.483626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.853 [2024-07-20 16:16:23.483682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.853 [2024-07-20 16:16:23.483696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.853 #5 NEW cov: 11692 ft: 13534 corp: 4/10b lim: 5 exec/s: 0 rss: 65Mb L: 2/5 MS: 1 ChangeBinInt- 00:07:54.854 [2024-07-20 16:16:23.523720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.854 [2024-07-20 16:16:23.523745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.854 [2024-07-20 16:16:23.523804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.854 [2024-07-20 16:16:23.523818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.854 #6 NEW cov: 11692 ft: 13598 corp: 5/12b lim: 5 exec/s: 0 rss: 65Mb L: 2/5 MS: 1 InsertByte- 00:07:54.854 [2024-07-20 16:16:23.563686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.854 [2024-07-20 16:16:23.563711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.854 #7 NEW cov: 11692 ft: 13761 corp: 6/13b lim: 5 exec/s: 0 rss: 65Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:54.854 [2024-07-20 16:16:23.604400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.854 [2024-07-20 16:16:23.604426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.854 [2024-07-20 16:16:23.604487] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.854 [2024-07-20 16:16:23.604501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.854 [2024-07-20 16:16:23.604564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.854 [2024-07-20 16:16:23.604578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.854 [2024-07-20 16:16:23.604634] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.854 [2024-07-20 16:16:23.604648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.854 [2024-07-20 16:16:23.604704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.854 [2024-07-20 16:16:23.604718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:54.854 #8 NEW cov: 11692 ft: 13845 corp: 7/18b lim: 5 exec/s: 0 rss: 65Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:54.854 [2024-07-20 16:16:23.643934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.854 [2024-07-20 16:16:23.643958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.112 #9 NEW cov: 11692 ft: 13903 corp: 8/19b lim: 5 exec/s: 0 rss: 66Mb L: 1/5 MS: 1 CrossOver- 00:07:55.112 [2024-07-20 16:16:23.684205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.112 [2024-07-20 16:16:23.684231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.112 [2024-07-20 16:16:23.684288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.112 [2024-07-20 16:16:23.684302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.112 #10 NEW cov: 11692 ft: 13952 corp: 9/21b lim: 5 exec/s: 0 rss: 66Mb L: 2/5 MS: 1 ShuffleBytes- 00:07:55.112 [2024-07-20 16:16:23.724792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.112 [2024-07-20 16:16:23.724818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.112 [2024-07-20 16:16:23.724893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.112 [2024-07-20 16:16:23.724907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.112 [2024-07-20 16:16:23.724965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.112 [2024-07-20 16:16:23.724979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.112 [2024-07-20 16:16:23.725036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.112 [2024-07-20 
16:16:23.725050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.112 [2024-07-20 16:16:23.725107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.112 [2024-07-20 16:16:23.725121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.112 #11 NEW cov: 11692 ft: 13975 corp: 10/26b lim: 5 exec/s: 0 rss: 66Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:55.112 [2024-07-20 16:16:23.764406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.112 [2024-07-20 16:16:23.764433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.112 [2024-07-20 16:16:23.764497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.112 [2024-07-20 16:16:23.764511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.112 #12 NEW cov: 11692 ft: 14111 corp: 11/28b lim: 5 exec/s: 0 rss: 66Mb L: 2/5 MS: 1 CMP- DE: "\376\377"- 00:07:55.112 [2024-07-20 16:16:23.804559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.112 [2024-07-20 16:16:23.804586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.112 [2024-07-20 16:16:23.804645] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.112 [2024-07-20 16:16:23.804659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.112 #13 NEW cov: 11692 ft: 14136 corp: 12/30b lim: 5 exec/s: 0 rss: 66Mb L: 2/5 MS: 1 ChangeBit- 00:07:55.112 [2024-07-20 16:16:23.845008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.112 [2024-07-20 16:16:23.845034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.112 [2024-07-20 16:16:23.845095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.112 [2024-07-20 16:16:23.845109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.112 [2024-07-20 16:16:23.845168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.112 [2024-07-20 16:16:23.845181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.112 [2024-07-20 16:16:23.845239] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.112 [2024-07-20 16:16:23.845252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.112 #14 NEW cov: 11692 ft: 14203 corp: 13/34b lim: 5 exec/s: 0 rss: 66Mb L: 4/5 MS: 1 CopyPart- 00:07:55.112 [2024-07-20 16:16:23.884765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.112 [2024-07-20 16:16:23.884791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.112 [2024-07-20 16:16:23.884849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.112 [2024-07-20 16:16:23.884863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.112 #15 NEW cov: 11692 ft: 14228 corp: 14/36b lim: 5 exec/s: 0 rss: 66Mb L: 2/5 MS: 1 ShuffleBytes- 00:07:55.370 [2024-07-20 16:16:23.925330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.370 [2024-07-20 16:16:23.925356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.370 [2024-07-20 16:16:23.925415] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.370 [2024-07-20 16:16:23.925429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.370 [2024-07-20 16:16:23.925456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.370 [2024-07-20 16:16:23.925467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.370 [2024-07-20 16:16:23.925539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.370 [2024-07-20 16:16:23.925553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.370 [2024-07-20 16:16:23.925613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.370 [2024-07-20 16:16:23.925627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.370 #16 NEW cov: 11692 ft: 14254 corp: 15/41b lim: 5 exec/s: 0 rss: 66Mb L: 5/5 MS: 1 ChangeBit- 00:07:55.370 [2024-07-20 16:16:23.964832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.370 [2024-07-20 
16:16:23.964858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.370 #17 NEW cov: 11692 ft: 14274 corp: 16/42b lim: 5 exec/s: 0 rss: 66Mb L: 1/5 MS: 1 ChangeBit- 00:07:55.370 [2024-07-20 16:16:24.005472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.370 [2024-07-20 16:16:24.005498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.370 [2024-07-20 16:16:24.005559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.370 [2024-07-20 16:16:24.005572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.370 [2024-07-20 16:16:24.005632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.370 [2024-07-20 16:16:24.005646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.370 [2024-07-20 16:16:24.005705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.370 [2024-07-20 16:16:24.005718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.370 #18 NEW cov: 11692 ft: 14291 corp: 17/46b lim: 5 exec/s: 0 rss: 66Mb L: 4/5 MS: 1 CrossOver- 00:07:55.370 [2024-07-20 16:16:24.045224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.370 [2024-07-20 16:16:24.045250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.370 [2024-07-20 16:16:24.045313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.370 [2024-07-20 16:16:24.045327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.370 #19 NEW cov: 11692 ft: 14332 corp: 18/48b lim: 5 exec/s: 0 rss: 66Mb L: 2/5 MS: 1 CrossOver- 00:07:55.370 [2024-07-20 16:16:24.085341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.370 [2024-07-20 16:16:24.085367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.370 [2024-07-20 16:16:24.085421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.370 [2024-07-20 16:16:24.085434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.370 #20 NEW cov: 11692 ft: 14397 corp: 19/50b lim: 5 exec/s: 0 
rss: 66Mb L: 2/5 MS: 1 ShuffleBytes- 00:07:55.370 [2024-07-20 16:16:24.125352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.370 [2024-07-20 16:16:24.125378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.370 #21 NEW cov: 11692 ft: 14457 corp: 20/51b lim: 5 exec/s: 0 rss: 66Mb L: 1/5 MS: 1 ChangeByte- 00:07:55.370 [2024-07-20 16:16:24.165925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.370 [2024-07-20 16:16:24.165950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.370 [2024-07-20 16:16:24.166008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.370 [2024-07-20 16:16:24.166022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.370 [2024-07-20 16:16:24.166078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.370 [2024-07-20 16:16:24.166091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.370 [2024-07-20 16:16:24.166147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.370 [2024-07-20 16:16:24.166160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.629 #22 NEW cov: 11692 ft: 14474 corp: 21/55b lim: 5 exec/s: 0 rss: 66Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:07:55.629 [2024-07-20 16:16:24.205578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.629 [2024-07-20 16:16:24.205604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.629 #23 NEW cov: 11692 ft: 14542 corp: 22/56b lim: 5 exec/s: 0 rss: 66Mb L: 1/5 MS: 1 ChangeByte- 00:07:55.629 [2024-07-20 16:16:24.236285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.629 [2024-07-20 16:16:24.236310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.629 [2024-07-20 16:16:24.236371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.629 [2024-07-20 16:16:24.236385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.629 [2024-07-20 16:16:24.236440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.629 [2024-07-20 16:16:24.236458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.629 [2024-07-20 16:16:24.236517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.629 [2024-07-20 16:16:24.236531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.629 [2024-07-20 16:16:24.236588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.629 [2024-07-20 16:16:24.236601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.887 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:55.887 #24 NEW cov: 11715 ft: 14563 corp: 23/61b lim: 5 exec/s: 24 rss: 68Mb L: 5/5 MS: 1 CrossOver- 00:07:55.887 [2024-07-20 16:16:24.536947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.887 [2024-07-20 16:16:24.536979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.887 [2024-07-20 16:16:24.537032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.887 [2024-07-20 16:16:24.537046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.887 [2024-07-20 16:16:24.537095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.887 [2024-07-20 16:16:24.537109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.887 [2024-07-20 16:16:24.537157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.887 [2024-07-20 16:16:24.537170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.887 [2024-07-20 16:16:24.537221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.887 [2024-07-20 16:16:24.537233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.887 #25 NEW cov: 11715 ft: 14574 corp: 24/66b lim: 5 exec/s: 25 rss: 68Mb L: 5/5 MS: 1 InsertByte- 00:07:55.887 [2024-07-20 16:16:24.576983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.887 [2024-07-20 16:16:24.577008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.887 [2024-07-20 16:16:24.577059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.887 [2024-07-20 16:16:24.577072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.887 [2024-07-20 16:16:24.577125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.887 [2024-07-20 16:16:24.577138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.887 [2024-07-20 16:16:24.577188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.887 [2024-07-20 16:16:24.577201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.887 [2024-07-20 16:16:24.577251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.887 [2024-07-20 16:16:24.577264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.887 #26 NEW cov: 11715 ft: 14644 corp: 25/71b lim: 5 exec/s: 26 rss: 68Mb L: 5/5 MS: 1 CrossOver- 00:07:55.887 [2024-07-20 16:16:24.616972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.887 [2024-07-20 16:16:24.616998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.887 [2024-07-20 16:16:24.617051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.887 [2024-07-20 16:16:24.617064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.887 [2024-07-20 16:16:24.617114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.887 [2024-07-20 16:16:24.617128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.887 [2024-07-20 16:16:24.617178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.887 [2024-07-20 16:16:24.617191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.887 #27 NEW cov: 11715 ft: 14686 corp: 26/75b lim: 5 exec/s: 27 rss: 68Mb L: 4/5 MS: 1 PersAutoDict- DE: "\376\377"- 00:07:55.887 [2024-07-20 16:16:24.656806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.887 [2024-07-20 16:16:24.656830] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.887 [2024-07-20 16:16:24.656881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.887 [2024-07-20 16:16:24.656894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.887 #28 NEW cov: 11715 ft: 14691 corp: 27/77b lim: 5 exec/s: 28 rss: 68Mb L: 2/5 MS: 1 CopyPart- 00:07:56.145 [2024-07-20 16:16:24.697190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.145 [2024-07-20 16:16:24.697214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.145 [2024-07-20 16:16:24.697266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.145 [2024-07-20 16:16:24.697283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.145 [2024-07-20 16:16:24.697333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.145 [2024-07-20 16:16:24.697346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.145 [2024-07-20 16:16:24.697397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.145 [2024-07-20 16:16:24.697410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.145 #29 NEW cov: 11715 ft: 14716 corp: 28/81b lim: 5 exec/s: 29 rss: 68Mb L: 4/5 MS: 1 ChangeByte- 00:07:56.145 [2024-07-20 16:16:24.737447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.145 [2024-07-20 16:16:24.737471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.145 [2024-07-20 16:16:24.737524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.145 [2024-07-20 16:16:24.737537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.145 [2024-07-20 16:16:24.737587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.145 [2024-07-20 16:16:24.737600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.145 [2024-07-20 16:16:24.737653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:07:56.145 [2024-07-20 16:16:24.737666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.145 [2024-07-20 16:16:24.737718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.145 [2024-07-20 16:16:24.737730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.145 #30 NEW cov: 11715 ft: 14728 corp: 29/86b lim: 5 exec/s: 30 rss: 68Mb L: 5/5 MS: 1 ShuffleBytes- 00:07:56.145 [2024-07-20 16:16:24.777149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.145 [2024-07-20 16:16:24.777174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.145 [2024-07-20 16:16:24.777225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.145 [2024-07-20 16:16:24.777238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.145 #31 NEW cov: 11715 ft: 14738 corp: 30/88b lim: 5 exec/s: 31 rss: 68Mb L: 2/5 MS: 1 InsertByte- 00:07:56.145 [2024-07-20 16:16:24.817124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.145 [2024-07-20 16:16:24.817149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.145 #32 NEW cov: 11715 ft: 14742 corp: 31/89b lim: 5 exec/s: 32 rss: 69Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:56.145 [2024-07-20 16:16:24.857800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.145 [2024-07-20 16:16:24.857828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.145 [2024-07-20 16:16:24.857897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.145 [2024-07-20 16:16:24.857910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.145 [2024-07-20 16:16:24.857961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.145 [2024-07-20 16:16:24.857974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.145 [2024-07-20 16:16:24.858022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.145 [2024-07-20 16:16:24.858035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.145 
[2024-07-20 16:16:24.858086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.145 [2024-07-20 16:16:24.858100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.145 #33 NEW cov: 11715 ft: 14778 corp: 32/94b lim: 5 exec/s: 33 rss: 69Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:56.145 [2024-07-20 16:16:24.897459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.145 [2024-07-20 16:16:24.897483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.145 [2024-07-20 16:16:24.897536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.145 [2024-07-20 16:16:24.897549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.145 #34 NEW cov: 11715 ft: 14785 corp: 33/96b lim: 5 exec/s: 34 rss: 69Mb L: 2/5 MS: 1 CrossOver- 00:07:56.145 [2024-07-20 16:16:24.938050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.145 [2024-07-20 16:16:24.938075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.145 [2024-07-20 16:16:24.938127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.145 [2024-07-20 16:16:24.938140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.145 [2024-07-20 16:16:24.938190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.145 [2024-07-20 16:16:24.938203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.145 [2024-07-20 16:16:24.938251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.145 [2024-07-20 16:16:24.938264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.145 [2024-07-20 16:16:24.938314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.145 [2024-07-20 16:16:24.938330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.403 #35 NEW cov: 11715 ft: 14791 corp: 34/101b lim: 5 exec/s: 35 rss: 69Mb L: 5/5 MS: 1 ChangeByte- 00:07:56.403 [2024-07-20 16:16:24.978164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:56.403 [2024-07-20 16:16:24.978189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.403 [2024-07-20 16:16:24.978240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.403 [2024-07-20 16:16:24.978253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.403 [2024-07-20 16:16:24.978303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.403 [2024-07-20 16:16:24.978316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.403 [2024-07-20 16:16:24.978366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.403 [2024-07-20 16:16:24.978379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.403 [2024-07-20 16:16:24.978430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.403 [2024-07-20 16:16:24.978445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.403 #36 NEW cov: 11715 ft: 14803 corp: 35/106b lim: 5 exec/s: 36 rss: 69Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:56.403 [2024-07-20 16:16:25.018288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.403 [2024-07-20 16:16:25.018312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.403 [2024-07-20 16:16:25.018363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.403 [2024-07-20 16:16:25.018376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.403 [2024-07-20 16:16:25.018427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.403 [2024-07-20 16:16:25.018440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.403 [2024-07-20 16:16:25.018494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.403 [2024-07-20 16:16:25.018507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.403 [2024-07-20 16:16:25.018557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.403 
[2024-07-20 16:16:25.018570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.403 #37 NEW cov: 11715 ft: 14811 corp: 36/111b lim: 5 exec/s: 37 rss: 69Mb L: 5/5 MS: 1 ChangeBit- 00:07:56.403 [2024-07-20 16:16:25.057979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.403 [2024-07-20 16:16:25.058004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.403 [2024-07-20 16:16:25.058071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.403 [2024-07-20 16:16:25.058084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.403 #38 NEW cov: 11715 ft: 14820 corp: 37/113b lim: 5 exec/s: 38 rss: 69Mb L: 2/5 MS: 1 ChangeByte- 00:07:56.403 [2024-07-20 16:16:25.098244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.403 [2024-07-20 16:16:25.098268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.403 [2024-07-20 16:16:25.098319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.403 [2024-07-20 16:16:25.098333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.403 [2024-07-20 16:16:25.098383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.403 [2024-07-20 16:16:25.098411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.403 #39 NEW cov: 11715 ft: 14980 corp: 38/116b lim: 5 exec/s: 39 rss: 69Mb L: 3/5 MS: 1 InsertByte- 00:07:56.403 [2024-07-20 16:16:25.138660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.403 [2024-07-20 16:16:25.138685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.403 [2024-07-20 16:16:25.138736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.403 [2024-07-20 16:16:25.138749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.403 [2024-07-20 16:16:25.138798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.403 [2024-07-20 16:16:25.138811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.403 [2024-07-20 16:16:25.138861] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.403 [2024-07-20 16:16:25.138873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.403 [2024-07-20 16:16:25.138922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.403 [2024-07-20 16:16:25.138935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.403 #40 NEW cov: 11715 ft: 14988 corp: 39/121b lim: 5 exec/s: 40 rss: 69Mb L: 5/5 MS: 1 ChangeBinInt- 00:07:56.403 [2024-07-20 16:16:25.178159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.403 [2024-07-20 16:16:25.178184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.403 #41 NEW cov: 11715 ft: 15020 corp: 40/122b lim: 5 exec/s: 41 rss: 69Mb L: 1/5 MS: 1 EraseBytes- 00:07:56.662 [2024-07-20 16:16:25.218455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.662 [2024-07-20 16:16:25.218480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.662 [2024-07-20 16:16:25.218534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.662 [2024-07-20 16:16:25.218547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.662 [2024-07-20 16:16:25.258564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.662 [2024-07-20 16:16:25.258589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.662 [2024-07-20 16:16:25.258657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.662 [2024-07-20 16:16:25.258671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.662 #43 NEW cov: 11715 ft: 15089 corp: 41/124b lim: 5 exec/s: 43 rss: 69Mb L: 2/5 MS: 2 CrossOver-CrossOver- 00:07:56.662 [2024-07-20 16:16:25.299128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.662 [2024-07-20 16:16:25.299154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.662 [2024-07-20 16:16:25.299207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.662 
[2024-07-20 16:16:25.299221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.662 [2024-07-20 16:16:25.299272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.662 [2024-07-20 16:16:25.299285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.662 [2024-07-20 16:16:25.299335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.662 [2024-07-20 16:16:25.299348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.662 [2024-07-20 16:16:25.299397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.662 [2024-07-20 16:16:25.299411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.662 #44 NEW cov: 11715 ft: 15091 corp: 42/129b lim: 5 exec/s: 44 rss: 69Mb L: 5/5 MS: 1 CopyPart- 00:07:56.662 [2024-07-20 16:16:25.338641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.662 [2024-07-20 16:16:25.338665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.662 #45 NEW cov: 11715 ft: 15099 corp: 43/130b lim: 5 exec/s: 22 rss: 69Mb L: 1/5 MS: 1 EraseBytes- 00:07:56.662 #45 DONE cov: 11715 ft: 15099 corp: 43/130b lim: 5 exec/s: 22 rss: 69Mb 00:07:56.662 ###### Recommended dictionary. ###### 00:07:56.662 "\376\377" # Uses: 1 00:07:56.662 ###### End of recommended dictionary. 
###### 00:07:56.662 Done 45 runs in 2 second(s) 00:07:56.921 16:16:25 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_8.conf 00:07:56.921 16:16:25 -- ../common.sh@72 -- # (( i++ )) 00:07:56.921 16:16:25 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:56.921 16:16:25 -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:07:56.921 16:16:25 -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:07:56.921 16:16:25 -- nvmf/run.sh@24 -- # local timen=1 00:07:56.921 16:16:25 -- nvmf/run.sh@25 -- # local core=0x1 00:07:56.921 16:16:25 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:56.921 16:16:25 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:07:56.921 16:16:25 -- nvmf/run.sh@29 -- # printf %02d 9 00:07:56.921 16:16:25 -- nvmf/run.sh@29 -- # port=4409 00:07:56.921 16:16:25 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:56.921 16:16:25 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:07:56.921 16:16:25 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:56.921 16:16:25 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 -r /var/tmp/spdk9.sock 00:07:56.921 [2024-07-20 16:16:25.515941] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:07:56.921 [2024-07-20 16:16:25.516010] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2273442 ] 00:07:56.921 EAL: No free 2048 kB hugepages reported on node 1 00:07:56.921 [2024-07-20 16:16:25.699315] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:56.921 [2024-07-20 16:16:25.719527] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:56.921 [2024-07-20 16:16:25.719666] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:57.180 [2024-07-20 16:16:25.771000] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:57.180 [2024-07-20 16:16:25.787312] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:07:57.180 INFO: Running with entropic power schedule (0xFF, 100). 
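[editor's note] The xtrace above (nvmf/run.sh@23-36) shows how the harness's start_llvm_fuzz step brings up round 9: it picks a per-round TCP port, builds a transport ID, rewrites the default listener port 4420 in the JSON config, and launches llvm_nvme_fuzz against it. The following is a minimal standalone sketch of that sequence, not part of the captured log. All flags and paths are copied verbatim from the trace; the 44$(printf %02d N) port derivation and the redirect of sed's output into the per-round config are assumptions inferred from "printf %02d 9" -> port=4409 (bash xtrace does not show redirections).

    # Hedged reconstruction of the traced start_llvm_fuzz step (assumptions noted above).
    start_llvm_fuzz() {
      local fuzzer_type=$1 timen=$2 core=$3
      local spdk=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
      local corpus_dir=$spdk/../corpus/llvm_nvmf_${fuzzer_type}
      local nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf
      # assumed port scheme: base 44 + zero-padded fuzzer index (9 -> 4409, as traced)
      local port=44$(printf %02d "$fuzzer_type")
      mkdir -p "$corpus_dir"
      local trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
      # rewrite the default listener port in the template config for this round;
      # the "> $nvmf_cfg" redirect is an assumption, xtrace only shows the sed command
      sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
          "$spdk/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
      # invocation copied from the nvmf/run.sh@36 trace: core mask, 512 MB hugepage
      # memory, profile/output dir, transport ID, per-round config, duration,
      # corpus dir, fuzzer selector, and RPC socket
      "$spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m "$core" -s 512 \
          -P "$spdk/../output/llvm/" -F "$trid" -c "$nvmf_cfg" -t "$timen" \
          -D "$corpus_dir" -Z "$fuzzer_type" -r "/var/tmp/spdk${fuzzer_type}.sock"
    }
    # usage, matching the "start_llvm_fuzz 9 1 0x1" call traced above:
    # start_llvm_fuzz 9 1 0x1

[end editor's note]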
00:07:57.180 INFO: Seed: 1225945802 00:07:57.180 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:07:57.180 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:07:57.180 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:57.180 INFO: A corpus is not provided, starting from an empty corpus 00:07:57.180 [2024-07-20 16:16:25.832588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.180 [2024-07-20 16:16:25.832617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.180 #2 INITED cov: 11488 ft: 11489 corp: 1/1b exec/s: 0 rss: 64Mb 00:07:57.180 [2024-07-20 16:16:25.862621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.180 [2024-07-20 16:16:25.862647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.180 #3 NEW cov: 11601 ft: 12017 corp: 2/2b lim: 5 exec/s: 0 rss: 66Mb L: 1/1 MS: 1 ChangeByte- 00:07:57.180 [2024-07-20 16:16:25.902659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.180 [2024-07-20 16:16:25.902686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.180 #4 NEW cov: 11607 ft: 12085 corp: 3/3b lim: 5 exec/s: 0 rss: 66Mb L: 1/1 MS: 1 ChangeBinInt- 00:07:57.180 [2024-07-20 16:16:25.942782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.180 [2024-07-20 16:16:25.942808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.180 #5 NEW cov: 11692 ft: 12456 corp: 4/4b lim: 5 exec/s: 0 rss: 66Mb L: 1/1 MS: 1 CrossOver- 00:07:57.180 [2024-07-20 16:16:25.982873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.180 [2024-07-20 16:16:25.982898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.439 #6 NEW cov: 11692 ft: 12653 corp: 5/5b lim: 5 exec/s: 0 rss: 66Mb L: 1/1 MS: 1 ChangeByte- 00:07:57.439 [2024-07-20 16:16:26.022994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.439 [2024-07-20 16:16:26.023020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.439 #7 NEW cov: 11692 ft: 12747 corp: 6/6b lim: 5 exec/s: 0 rss: 66Mb L: 1/1 MS: 1 ChangeByte- 00:07:57.439 [2024-07-20 16:16:26.063294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.439 [2024-07-20 16:16:26.063320] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.439 [2024-07-20 16:16:26.063376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.439 [2024-07-20 16:16:26.063390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.439 #8 NEW cov: 11692 ft: 13532 corp: 7/8b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 InsertByte- 00:07:57.439 [2024-07-20 16:16:26.103257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.439 [2024-07-20 16:16:26.103282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.439 #9 NEW cov: 11692 ft: 13575 corp: 8/9b lim: 5 exec/s: 0 rss: 66Mb L: 1/2 MS: 1 ChangeBinInt- 00:07:57.439 [2024-07-20 16:16:26.143545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.439 [2024-07-20 16:16:26.143570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.439 [2024-07-20 16:16:26.143624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.439 [2024-07-20 16:16:26.143637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.439 #10 NEW cov: 11692 ft: 13710 corp: 9/11b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 InsertByte- 00:07:57.439 [2024-07-20 16:16:26.183667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.439 [2024-07-20 16:16:26.183692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.439 [2024-07-20 16:16:26.183748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.439 [2024-07-20 16:16:26.183764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.439 #11 NEW cov: 11692 ft: 13836 corp: 10/13b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 CopyPart- 00:07:57.439 [2024-07-20 16:16:26.223561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.439 [2024-07-20 16:16:26.223586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.439 #12 NEW cov: 11692 ft: 13893 corp: 11/14b lim: 5 exec/s: 0 rss: 66Mb L: 1/2 MS: 1 ShuffleBytes- 00:07:57.699 [2024-07-20 16:16:26.254165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.699 [2024-07-20 16:16:26.254190] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.699 [2024-07-20 16:16:26.254246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.699 [2024-07-20 16:16:26.254260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.699 [2024-07-20 16:16:26.254317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.699 [2024-07-20 16:16:26.254330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.699 [2024-07-20 16:16:26.254387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.699 [2024-07-20 16:16:26.254400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.699 #13 NEW cov: 11692 ft: 14207 corp: 12/18b lim: 5 exec/s: 0 rss: 66Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:07:57.699 [2024-07-20 16:16:26.293948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.699 [2024-07-20 16:16:26.293972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.699 [2024-07-20 16:16:26.294030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.699 [2024-07-20 16:16:26.294043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.700 #14 NEW cov: 11692 ft: 14254 corp: 13/20b lim: 5 exec/s: 0 rss: 66Mb L: 2/4 MS: 1 ChangeByte- 00:07:57.700 [2024-07-20 16:16:26.333939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.700 [2024-07-20 16:16:26.333964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.700 #15 NEW cov: 11692 ft: 14262 corp: 14/21b lim: 5 exec/s: 0 rss: 67Mb L: 1/4 MS: 1 ChangeByte- 00:07:57.700 [2024-07-20 16:16:26.374056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.700 [2024-07-20 16:16:26.374081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.700 #16 NEW cov: 11692 ft: 14368 corp: 15/22b lim: 5 exec/s: 0 rss: 67Mb L: 1/4 MS: 1 ChangeByte- 00:07:57.700 [2024-07-20 16:16:26.414348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.700 [2024-07-20 16:16:26.414375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f 
p:0 m:0 dnr:0 00:07:57.700 [2024-07-20 16:16:26.414430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.700 [2024-07-20 16:16:26.414448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.700 #17 NEW cov: 11692 ft: 14373 corp: 16/24b lim: 5 exec/s: 0 rss: 67Mb L: 2/4 MS: 1 CopyPart- 00:07:57.700 [2024-07-20 16:16:26.454295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.700 [2024-07-20 16:16:26.454320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.700 #18 NEW cov: 11692 ft: 14447 corp: 17/25b lim: 5 exec/s: 0 rss: 67Mb L: 1/4 MS: 1 CrossOver- 00:07:57.700 [2024-07-20 16:16:26.494954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.700 [2024-07-20 16:16:26.494979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.700 [2024-07-20 16:16:26.495036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.700 [2024-07-20 16:16:26.495050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.700 [2024-07-20 16:16:26.495105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.700 [2024-07-20 16:16:26.495119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.700 [2024-07-20 16:16:26.495174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.700 [2024-07-20 16:16:26.495187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.959 #19 NEW cov: 11692 ft: 14506 corp: 18/29b lim: 5 exec/s: 0 rss: 67Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:07:57.959 [2024-07-20 16:16:26.534690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.959 [2024-07-20 16:16:26.534716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.959 [2024-07-20 16:16:26.534772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.959 [2024-07-20 16:16:26.534786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.959 #20 NEW cov: 11692 ft: 14522 corp: 19/31b lim: 5 exec/s: 0 rss: 67Mb L: 2/4 MS: 1 EraseBytes- 00:07:57.959 [2024-07-20 16:16:26.574814] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.959 [2024-07-20 16:16:26.574840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.959 [2024-07-20 16:16:26.574898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.959 [2024-07-20 16:16:26.574912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.959 #21 NEW cov: 11692 ft: 14532 corp: 20/33b lim: 5 exec/s: 0 rss: 67Mb L: 2/4 MS: 1 InsertByte- 00:07:57.959 [2024-07-20 16:16:26.614788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.959 [2024-07-20 16:16:26.614814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.959 #22 NEW cov: 11692 ft: 14565 corp: 21/34b lim: 5 exec/s: 0 rss: 67Mb L: 1/4 MS: 1 ChangeBit- 00:07:57.959 [2024-07-20 16:16:26.655551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.959 [2024-07-20 16:16:26.655578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.959 [2024-07-20 16:16:26.655635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.960 [2024-07-20 16:16:26.655649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.960 [2024-07-20 16:16:26.655704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.960 [2024-07-20 16:16:26.655718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.960 [2024-07-20 16:16:26.655776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.960 [2024-07-20 16:16:26.655789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.960 [2024-07-20 16:16:26.655846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.960 [2024-07-20 16:16:26.655859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:57.960 #23 NEW cov: 11692 ft: 14632 corp: 22/39b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:57.960 [2024-07-20 16:16:26.695661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.960 [2024-07-20 16:16:26.695685] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.960 [2024-07-20 16:16:26.695743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.960 [2024-07-20 16:16:26.695756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.960 [2024-07-20 16:16:26.695814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.960 [2024-07-20 16:16:26.695828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.960 [2024-07-20 16:16:26.695885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.960 [2024-07-20 16:16:26.695898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.960 [2024-07-20 16:16:26.695957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.960 [2024-07-20 16:16:26.695970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:58.219 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:58.219 #24 NEW cov: 11715 ft: 14641 corp: 23/44b lim: 5 exec/s: 24 rss: 69Mb L: 5/5 MS: 1 ChangeByte- 00:07:58.219 [2024-07-20 16:16:26.996143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.219 [2024-07-20 16:16:26.996182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.219 [2024-07-20 16:16:26.996244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.219 [2024-07-20 16:16:26.996261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.219 [2024-07-20 16:16:26.996322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.219 [2024-07-20 16:16:26.996337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.480 #25 NEW cov: 11715 ft: 14805 corp: 24/47b lim: 5 exec/s: 25 rss: 69Mb L: 3/5 MS: 1 InsertByte- 00:07:58.480 [2024-07-20 16:16:27.045751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.480 [2024-07-20 16:16:27.045780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.480 #26 NEW cov: 11715 ft: 14824 corp: 25/48b lim: 5 
exec/s: 26 rss: 69Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:58.480 [2024-07-20 16:16:27.086083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.480 [2024-07-20 16:16:27.086108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.480 [2024-07-20 16:16:27.086162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.480 [2024-07-20 16:16:27.086176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.480 #27 NEW cov: 11715 ft: 14836 corp: 26/50b lim: 5 exec/s: 27 rss: 69Mb L: 2/5 MS: 1 ChangeBinInt- 00:07:58.480 [2024-07-20 16:16:27.126284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.480 [2024-07-20 16:16:27.126310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.480 [2024-07-20 16:16:27.126364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.480 [2024-07-20 16:16:27.126378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.480 [2024-07-20 16:16:27.126434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.480 [2024-07-20 16:16:27.126453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.480 #28 NEW cov: 11715 ft: 14854 corp: 27/53b lim: 5 exec/s: 28 rss: 69Mb L: 3/5 MS: 1 InsertByte- 00:07:58.480 [2024-07-20 16:16:27.166740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.480 [2024-07-20 16:16:27.166768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.480 [2024-07-20 16:16:27.166840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.480 [2024-07-20 16:16:27.166854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.480 [2024-07-20 16:16:27.166909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.480 [2024-07-20 16:16:27.166922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.480 [2024-07-20 16:16:27.166974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.480 [2024-07-20 16:16:27.166987] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.480 [2024-07-20 16:16:27.167041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.480 [2024-07-20 16:16:27.167055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:58.480 #29 NEW cov: 11715 ft: 14889 corp: 28/58b lim: 5 exec/s: 29 rss: 69Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:58.480 [2024-07-20 16:16:27.206411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.480 [2024-07-20 16:16:27.206437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.480 [2024-07-20 16:16:27.206498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.480 [2024-07-20 16:16:27.206512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.480 #30 NEW cov: 11715 ft: 14899 corp: 29/60b lim: 5 exec/s: 30 rss: 69Mb L: 2/5 MS: 1 ChangeBinInt- 00:07:58.480 [2024-07-20 16:16:27.246519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.480 [2024-07-20 16:16:27.246544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.480 [2024-07-20 16:16:27.246599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.480 [2024-07-20 16:16:27.246612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.480 #31 NEW cov: 11715 ft: 14917 corp: 30/62b lim: 5 exec/s: 31 rss: 69Mb L: 2/5 MS: 1 InsertByte- 00:07:58.741 [2024-07-20 16:16:27.286639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.741 [2024-07-20 16:16:27.286665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.741 [2024-07-20 16:16:27.286723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.741 [2024-07-20 16:16:27.286737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.741 #32 NEW cov: 11715 ft: 14921 corp: 31/64b lim: 5 exec/s: 32 rss: 69Mb L: 2/5 MS: 1 ChangeBit- 00:07:58.741 [2024-07-20 16:16:27.326870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.741 [2024-07-20 16:16:27.326894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.741 [2024-07-20 16:16:27.326951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.741 [2024-07-20 16:16:27.326964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.741 [2024-07-20 16:16:27.327020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.741 [2024-07-20 16:16:27.327033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.741 #33 NEW cov: 11715 ft: 14930 corp: 32/67b lim: 5 exec/s: 33 rss: 69Mb L: 3/5 MS: 1 CrossOver- 00:07:58.741 [2024-07-20 16:16:27.366840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.741 [2024-07-20 16:16:27.366864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.741 [2024-07-20 16:16:27.366933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.741 [2024-07-20 16:16:27.366947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.741 #34 NEW cov: 11715 ft: 14940 corp: 33/69b lim: 5 exec/s: 34 rss: 69Mb L: 2/5 MS: 1 ChangeByte- 00:07:58.741 [2024-07-20 16:16:27.407277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.741 [2024-07-20 16:16:27.407303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.741 [2024-07-20 16:16:27.407358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.741 [2024-07-20 16:16:27.407371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.741 [2024-07-20 16:16:27.407425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.741 [2024-07-20 16:16:27.407438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.741 [2024-07-20 16:16:27.407495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.741 [2024-07-20 16:16:27.407508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.741 #35 NEW cov: 11715 ft: 14952 corp: 34/73b lim: 5 exec/s: 35 rss: 69Mb L: 4/5 MS: 1 ChangeBit- 00:07:58.741 [2024-07-20 16:16:27.447101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.741 [2024-07-20 16:16:27.447126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.741 [2024-07-20 16:16:27.447181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.741 [2024-07-20 16:16:27.447195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.741 #36 NEW cov: 11715 ft: 14980 corp: 35/75b lim: 5 exec/s: 36 rss: 69Mb L: 2/5 MS: 1 ChangeBit- 00:07:58.741 [2024-07-20 16:16:27.487675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.741 [2024-07-20 16:16:27.487699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.741 [2024-07-20 16:16:27.487751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.741 [2024-07-20 16:16:27.487765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.741 [2024-07-20 16:16:27.487817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.741 [2024-07-20 16:16:27.487830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.741 [2024-07-20 16:16:27.487883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.741 [2024-07-20 16:16:27.487896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.741 [2024-07-20 16:16:27.487953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.741 [2024-07-20 16:16:27.487966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:58.741 #37 NEW cov: 11715 ft: 14999 corp: 36/80b lim: 5 exec/s: 37 rss: 70Mb L: 5/5 MS: 1 CrossOver- 00:07:58.741 [2024-07-20 16:16:27.527776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.741 [2024-07-20 16:16:27.527800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.741 [2024-07-20 16:16:27.527852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.741 [2024-07-20 16:16:27.527865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.741 [2024-07-20 16:16:27.527918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.741 [2024-07-20 16:16:27.527947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.741 [2024-07-20 16:16:27.528004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.741 [2024-07-20 16:16:27.528017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.741 [2024-07-20 16:16:27.528069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.741 [2024-07-20 16:16:27.528082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.000 #38 NEW cov: 11715 ft: 15031 corp: 37/85b lim: 5 exec/s: 38 rss: 70Mb L: 5/5 MS: 1 InsertByte- 00:07:59.000 [2024-07-20 16:16:27.567437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.000 [2024-07-20 16:16:27.567465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.000 [2024-07-20 16:16:27.567539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.000 [2024-07-20 16:16:27.567553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.000 #39 NEW cov: 11715 ft: 15038 corp: 38/87b lim: 5 exec/s: 39 rss: 70Mb L: 2/5 MS: 1 ChangeByte- 00:07:59.000 [2024-07-20 16:16:27.608040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.000 [2024-07-20 16:16:27.608064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.000 [2024-07-20 16:16:27.608119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.000 [2024-07-20 16:16:27.608132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.000 [2024-07-20 16:16:27.608185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.000 [2024-07-20 16:16:27.608199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.000 [2024-07-20 16:16:27.608254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.000 [2024-07-20 16:16:27.608267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.000 [2024-07-20 
16:16:27.608320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.000 [2024-07-20 16:16:27.608332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.000 #40 NEW cov: 11715 ft: 15050 corp: 39/92b lim: 5 exec/s: 40 rss: 70Mb L: 5/5 MS: 1 ChangeByte- 00:07:59.000 [2024-07-20 16:16:27.647690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.000 [2024-07-20 16:16:27.647715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.000 [2024-07-20 16:16:27.647765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.000 [2024-07-20 16:16:27.647779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.000 #41 NEW cov: 11715 ft: 15066 corp: 40/94b lim: 5 exec/s: 41 rss: 70Mb L: 2/5 MS: 1 ChangeBit- 00:07:59.000 [2024-07-20 16:16:27.687833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.000 [2024-07-20 16:16:27.687860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.001 [2024-07-20 16:16:27.687912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.001 [2024-07-20 16:16:27.687926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.001 #42 NEW cov: 11715 ft: 15070 corp: 41/96b lim: 5 exec/s: 42 rss: 70Mb L: 2/5 MS: 1 ChangeBinInt- 00:07:59.001 [2024-07-20 16:16:27.728390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.001 [2024-07-20 16:16:27.728418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.001 [2024-07-20 16:16:27.728471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.001 [2024-07-20 16:16:27.728485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.001 [2024-07-20 16:16:27.728537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.001 [2024-07-20 16:16:27.728550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.001 [2024-07-20 16:16:27.728604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.001 
[2024-07-20 16:16:27.728616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.001 [2024-07-20 16:16:27.728668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.001 [2024-07-20 16:16:27.728680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.001 #43 NEW cov: 11715 ft: 15103 corp: 42/101b lim: 5 exec/s: 43 rss: 70Mb L: 5/5 MS: 1 ChangeBit- 00:07:59.001 [2024-07-20 16:16:27.768046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.001 [2024-07-20 16:16:27.768072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.001 [2024-07-20 16:16:27.768140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.001 [2024-07-20 16:16:27.768154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.001 #44 NEW cov: 11715 ft: 15126 corp: 43/103b lim: 5 exec/s: 44 rss: 70Mb L: 2/5 MS: 1 ChangeByte- 00:07:59.259 [2024-07-20 16:16:27.808647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.259 [2024-07-20 16:16:27.808672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.260 [2024-07-20 16:16:27.808723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.260 [2024-07-20 16:16:27.808736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.260 [2024-07-20 16:16:27.808789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.260 [2024-07-20 16:16:27.808818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.260 [2024-07-20 16:16:27.808868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.260 [2024-07-20 16:16:27.808881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.260 [2024-07-20 16:16:27.808932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.260 [2024-07-20 16:16:27.808948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.260 #45 NEW cov: 11715 ft: 15133 corp: 44/108b lim: 5 exec/s: 22 rss: 70Mb L: 5/5 MS: 1 CopyPart- 00:07:59.260 #45 DONE cov: 11715 ft: 15133 corp: 
44/108b lim: 5 exec/s: 22 rss: 70Mb 00:07:59.260 Done 45 runs in 2 second(s) 00:07:59.260 16:16:27 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_9.conf 00:07:59.260 16:16:27 -- ../common.sh@72 -- # (( i++ )) 00:07:59.260 16:16:27 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:59.260 16:16:27 -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1 00:07:59.260 16:16:27 -- nvmf/run.sh@23 -- # local fuzzer_type=10 00:07:59.260 16:16:27 -- nvmf/run.sh@24 -- # local timen=1 00:07:59.260 16:16:27 -- nvmf/run.sh@25 -- # local core=0x1 00:07:59.260 16:16:27 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:59.260 16:16:27 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf 00:07:59.260 16:16:27 -- nvmf/run.sh@29 -- # printf %02d 10 00:07:59.260 16:16:27 -- nvmf/run.sh@29 -- # port=4410 00:07:59.260 16:16:27 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:59.260 16:16:27 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' 00:07:59.260 16:16:27 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:59.260 16:16:27 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 -r /var/tmp/spdk10.sock 00:07:59.260 [2024-07-20 16:16:27.983719] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:07:59.260 [2024-07-20 16:16:27.983791] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2273862 ] 00:07:59.260 EAL: No free 2048 kB hugepages reported on node 1 00:07:59.519 [2024-07-20 16:16:28.158999] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:59.519 [2024-07-20 16:16:28.178255] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:59.519 [2024-07-20 16:16:28.178376] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:59.519 [2024-07-20 16:16:28.229711] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:59.519 [2024-07-20 16:16:28.246040] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 *** 00:07:59.519 INFO: Running with entropic power schedule (0xFF, 100). 00:07:59.519 INFO: Seed: 3684954649 00:07:59.519 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:07:59.519 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:07:59.519 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:59.519 INFO: A corpus is not provided, starting from an empty corpus 00:07:59.519 #2 INITED exec/s: 0 rss: 59Mb 00:07:59.519 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:59.519 This may also happen if the target rejected all inputs we tried so far 00:07:59.519 [2024-07-20 16:16:28.291523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.519 [2024-07-20 16:16:28.291553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.519 [2024-07-20 16:16:28.291607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.519 [2024-07-20 16:16:28.291621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.519 [2024-07-20 16:16:28.291685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.519 [2024-07-20 16:16:28.291698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.519 [2024-07-20 16:16:28.291753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.519 [2024-07-20 16:16:28.291766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.085 NEW_FUNC[1/670]: 0x49e620 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:08:00.085 NEW_FUNC[2/670]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:00.085 #4 NEW cov: 11511 ft: 11512 corp: 2/38b lim: 40 exec/s: 0 rss: 66Mb L: 37/37 MS: 2 ChangeBit-InsertRepeatedBytes- 00:08:00.085 [2024-07-20 16:16:28.602427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.085 [2024-07-20 16:16:28.602465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.085 [2024-07-20 16:16:28.602521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.085 [2024-07-20 16:16:28.602535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.085 [2024-07-20 16:16:28.602594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.085 [2024-07-20 16:16:28.602624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.085 [2024-07-20 16:16:28.602682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.085 [2024-07-20 16:16:28.602695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 
dnr:0 00:08:00.085 [2024-07-20 16:16:28.602755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.085 [2024-07-20 16:16:28.602768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:00.085 #5 NEW cov: 11624 ft: 11980 corp: 3/78b lim: 40 exec/s: 0 rss: 66Mb L: 40/40 MS: 1 CopyPart- 00:08:00.085 [2024-07-20 16:16:28.652480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.085 [2024-07-20 16:16:28.652508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.085 [2024-07-20 16:16:28.652566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.085 [2024-07-20 16:16:28.652580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.085 [2024-07-20 16:16:28.652636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.085 [2024-07-20 16:16:28.652650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.085 [2024-07-20 16:16:28.652710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.085 [2024-07-20 16:16:28.652723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.085 [2024-07-20 16:16:28.652782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.085 [2024-07-20 16:16:28.652795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:00.085 #6 NEW cov: 11630 ft: 12251 corp: 4/118b lim: 40 exec/s: 0 rss: 66Mb L: 40/40 MS: 1 CopyPart- 00:08:00.085 [2024-07-20 16:16:28.692638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.085 [2024-07-20 16:16:28.692665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.086 [2024-07-20 16:16:28.692721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.086 [2024-07-20 16:16:28.692734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.086 [2024-07-20 16:16:28.692789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.086 [2024-07-20 16:16:28.692802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.086 [2024-07-20 16:16:28.692857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.086 [2024-07-20 16:16:28.692871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.086 [2024-07-20 16:16:28.692927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.086 [2024-07-20 16:16:28.692941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:00.086 #7 NEW cov: 11715 ft: 12539 corp: 5/158b lim: 40 exec/s: 0 rss: 66Mb L: 40/40 MS: 1 ChangeBinInt- 00:08:00.086 [2024-07-20 16:16:28.732722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:29000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.086 [2024-07-20 16:16:28.732749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.086 [2024-07-20 16:16:28.732808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.086 [2024-07-20 16:16:28.732821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.086 [2024-07-20 16:16:28.732874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.086 [2024-07-20 16:16:28.732888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.086 [2024-07-20 16:16:28.732942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.086 [2024-07-20 16:16:28.732955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.086 [2024-07-20 16:16:28.733012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.086 [2024-07-20 16:16:28.733026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:00.086 #13 NEW cov: 11715 ft: 12703 corp: 6/198b lim: 40 exec/s: 0 rss: 66Mb L: 40/40 MS: 1 ChangeByte- 00:08:00.086 [2024-07-20 16:16:28.772860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.086 [2024-07-20 16:16:28.772886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.086 [2024-07-20 16:16:28.772943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.086 [2024-07-20 16:16:28.772957] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.086 [2024-07-20 16:16:28.773013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.086 [2024-07-20 16:16:28.773026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.086 [2024-07-20 16:16:28.773080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.086 [2024-07-20 16:16:28.773094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.086 [2024-07-20 16:16:28.773149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.086 [2024-07-20 16:16:28.773163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:00.086 #14 NEW cov: 11715 ft: 12828 corp: 7/238b lim: 40 exec/s: 0 rss: 66Mb L: 40/40 MS: 1 ChangeBinInt- 00:08:00.086 [2024-07-20 16:16:28.812996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.086 [2024-07-20 16:16:28.813021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.086 [2024-07-20 16:16:28.813092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.086 [2024-07-20 16:16:28.813105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.086 [2024-07-20 16:16:28.813160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.086 [2024-07-20 16:16:28.813173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.086 [2024-07-20 16:16:28.813230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:0000fd00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.086 [2024-07-20 16:16:28.813243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.086 [2024-07-20 16:16:28.813300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.086 [2024-07-20 16:16:28.813314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:00.086 #15 NEW cov: 11715 ft: 12884 corp: 8/278b lim: 40 exec/s: 0 rss: 66Mb L: 40/40 MS: 1 ChangeBinInt- 00:08:00.086 [2024-07-20 16:16:28.853074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:08:00.086 [2024-07-20 16:16:28.853101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.086 [2024-07-20 16:16:28.853159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.086 [2024-07-20 16:16:28.853173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.086 [2024-07-20 16:16:28.853247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00010000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.086 [2024-07-20 16:16:28.853261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.086 [2024-07-20 16:16:28.853318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:04000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.086 [2024-07-20 16:16:28.853332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.086 [2024-07-20 16:16:28.853390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.086 [2024-07-20 16:16:28.853403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:00.086 #16 NEW cov: 11715 ft: 12974 corp: 9/318b lim: 40 exec/s: 0 rss: 67Mb L: 40/40 MS: 1 ChangeBinInt- 00:08:00.345 [2024-07-20 16:16:28.893242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.345 [2024-07-20 16:16:28.893268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.345 [2024-07-20 16:16:28.893327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.345 [2024-07-20 16:16:28.893340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.345 [2024-07-20 16:16:28.893398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.345 [2024-07-20 16:16:28.893411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.345 [2024-07-20 16:16:28.893468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.345 [2024-07-20 16:16:28.893482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.345 [2024-07-20 16:16:28.893539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.345 [2024-07-20 16:16:28.893552] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:00.345 #17 NEW cov: 11715 ft: 13005 corp: 10/358b lim: 40 exec/s: 0 rss: 67Mb L: 40/40 MS: 1 CrossOver- 00:08:00.345 [2024-07-20 16:16:28.933052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.345 [2024-07-20 16:16:28.933078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.345 [2024-07-20 16:16:28.933140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.345 [2024-07-20 16:16:28.933155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.345 [2024-07-20 16:16:28.933212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.345 [2024-07-20 16:16:28.933225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.345 #18 NEW cov: 11715 ft: 13502 corp: 11/387b lim: 40 exec/s: 0 rss: 67Mb L: 29/40 MS: 1 EraseBytes- 00:08:00.345 [2024-07-20 16:16:28.973318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.345 [2024-07-20 16:16:28.973343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.345 [2024-07-20 16:16:28.973403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.345 [2024-07-20 16:16:28.973417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.345 [2024-07-20 16:16:28.973476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.345 [2024-07-20 16:16:28.973489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.345 [2024-07-20 16:16:28.973548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.345 [2024-07-20 16:16:28.973560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.345 #19 NEW cov: 11715 ft: 13521 corp: 12/419b lim: 40 exec/s: 0 rss: 67Mb L: 32/40 MS: 1 EraseBytes- 00:08:00.345 [2024-07-20 16:16:29.013447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08100000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.345 [2024-07-20 16:16:29.013472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.345 [2024-07-20 16:16:29.013530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.345 [2024-07-20 16:16:29.013544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.345 [2024-07-20 16:16:29.013602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.345 [2024-07-20 16:16:29.013615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.345 [2024-07-20 16:16:29.013669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.345 [2024-07-20 16:16:29.013682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.345 #20 NEW cov: 11715 ft: 13560 corp: 13/456b lim: 40 exec/s: 0 rss: 67Mb L: 37/40 MS: 1 ChangeBit- 00:08:00.345 [2024-07-20 16:16:29.053672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.345 [2024-07-20 16:16:29.053697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.345 [2024-07-20 16:16:29.053755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.345 [2024-07-20 16:16:29.053769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.345 [2024-07-20 16:16:29.053828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.345 [2024-07-20 16:16:29.053842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.345 [2024-07-20 16:16:29.053897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000400 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.345 [2024-07-20 16:16:29.053910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.345 [2024-07-20 16:16:29.053967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.345 [2024-07-20 16:16:29.053980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:00.345 #21 NEW cov: 11715 ft: 13601 corp: 14/496b lim: 40 exec/s: 0 rss: 67Mb L: 40/40 MS: 1 CrossOver- 00:08:00.345 [2024-07-20 16:16:29.093541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.345 [2024-07-20 16:16:29.093566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.345 [2024-07-20 16:16:29.093625] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.345 [2024-07-20 16:16:29.093639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.345 [2024-07-20 16:16:29.093696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.345 [2024-07-20 16:16:29.093709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.345 #22 NEW cov: 11715 ft: 13615 corp: 15/523b lim: 40 exec/s: 0 rss: 67Mb L: 27/40 MS: 1 EraseBytes- 00:08:00.345 [2024-07-20 16:16:29.133867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.345 [2024-07-20 16:16:29.133892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.345 [2024-07-20 16:16:29.133949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.345 [2024-07-20 16:16:29.133963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.345 [2024-07-20 16:16:29.134036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.345 [2024-07-20 16:16:29.134050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.345 [2024-07-20 16:16:29.134108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.345 [2024-07-20 16:16:29.134121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.345 [2024-07-20 16:16:29.134181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.345 [2024-07-20 16:16:29.134195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:00.603 #23 NEW cov: 11715 ft: 13642 corp: 16/563b lim: 40 exec/s: 0 rss: 67Mb L: 40/40 MS: 1 ChangeBit- 00:08:00.603 [2024-07-20 16:16:29.173748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.603 [2024-07-20 16:16:29.173773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.603 [2024-07-20 16:16:29.173845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000800 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.603 [2024-07-20 16:16:29.173859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 
m:0 dnr:0 00:08:00.603 [2024-07-20 16:16:29.173917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.603 [2024-07-20 16:16:29.173930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.603 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:00.603 #24 NEW cov: 11738 ft: 13694 corp: 17/590b lim: 40 exec/s: 0 rss: 67Mb L: 27/40 MS: 1 ChangeBit- 00:08:00.603 [2024-07-20 16:16:29.214133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.603 [2024-07-20 16:16:29.214158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.603 [2024-07-20 16:16:29.214216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.603 [2024-07-20 16:16:29.214230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.603 [2024-07-20 16:16:29.214285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.603 [2024-07-20 16:16:29.214298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.603 [2024-07-20 16:16:29.214355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.603 [2024-07-20 16:16:29.214368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.603 [2024-07-20 16:16:29.214428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00001000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.603 [2024-07-20 16:16:29.214445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:00.603 #25 NEW cov: 11738 ft: 13703 corp: 18/630b lim: 40 exec/s: 0 rss: 67Mb L: 40/40 MS: 1 ChangeBit- 00:08:00.603 [2024-07-20 16:16:29.254212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.603 [2024-07-20 16:16:29.254238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.603 [2024-07-20 16:16:29.254296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.603 [2024-07-20 16:16:29.254312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.603 [2024-07-20 16:16:29.254371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:08:00.603 [2024-07-20 16:16:29.254384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.603 [2024-07-20 16:16:29.254444] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:0000fd00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.603 [2024-07-20 16:16:29.254458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.604 [2024-07-20 16:16:29.254512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00280000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.604 [2024-07-20 16:16:29.254525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:00.604 #26 NEW cov: 11738 ft: 13706 corp: 19/670b lim: 40 exec/s: 0 rss: 67Mb L: 40/40 MS: 1 ChangeBinInt- 00:08:00.604 [2024-07-20 16:16:29.294354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00002800 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.604 [2024-07-20 16:16:29.294379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.604 [2024-07-20 16:16:29.294438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.604 [2024-07-20 16:16:29.294455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.604 [2024-07-20 16:16:29.294512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.604 [2024-07-20 16:16:29.294525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.604 [2024-07-20 16:16:29.294582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.604 [2024-07-20 16:16:29.294595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.604 [2024-07-20 16:16:29.294652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.604 [2024-07-20 16:16:29.294665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:00.604 #27 NEW cov: 11738 ft: 13725 corp: 20/710b lim: 40 exec/s: 27 rss: 67Mb L: 40/40 MS: 1 ChangeByte- 00:08:00.604 [2024-07-20 16:16:29.334396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.604 [2024-07-20 16:16:29.334422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.604 [2024-07-20 16:16:29.334483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 
cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.604 [2024-07-20 16:16:29.334497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.604 [2024-07-20 16:16:29.334558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:f5000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.604 [2024-07-20 16:16:29.334575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.604 [2024-07-20 16:16:29.334633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.604 [2024-07-20 16:16:29.334646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.604 #28 NEW cov: 11738 ft: 13749 corp: 21/742b lim: 40 exec/s: 28 rss: 67Mb L: 32/40 MS: 1 ChangeByte- 00:08:00.604 [2024-07-20 16:16:29.374228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.604 [2024-07-20 16:16:29.374253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.604 [2024-07-20 16:16:29.374312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00080000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.604 [2024-07-20 16:16:29.374325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.604 #29 NEW cov: 11738 ft: 13999 corp: 22/764b lim: 40 exec/s: 29 rss: 67Mb L: 22/40 MS: 1 CrossOver- 00:08:00.862 [2024-07-20 16:16:29.414818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.862 [2024-07-20 16:16:29.414843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.862 [2024-07-20 16:16:29.414919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.862 [2024-07-20 16:16:29.414932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.862 [2024-07-20 16:16:29.414989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.862 [2024-07-20 16:16:29.415002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.862 [2024-07-20 16:16:29.415059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00820000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.862 [2024-07-20 16:16:29.415073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.862 [2024-07-20 16:16:29.415128] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.862 [2024-07-20 16:16:29.415142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:00.862 #30 NEW cov: 11738 ft: 14005 corp: 23/804b lim: 40 exec/s: 30 rss: 67Mb L: 40/40 MS: 1 CMP- DE: "\000\202"- 00:08:00.862 [2024-07-20 16:16:29.454791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000100 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.862 [2024-07-20 16:16:29.454816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.862 [2024-07-20 16:16:29.454877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.862 [2024-07-20 16:16:29.454890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.862 [2024-07-20 16:16:29.454949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.862 [2024-07-20 16:16:29.454965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.862 [2024-07-20 16:16:29.455021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.862 [2024-07-20 16:16:29.455034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.862 [2024-07-20 16:16:29.455090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.862 [2024-07-20 16:16:29.455103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:00.862 #31 NEW cov: 11738 ft: 14045 corp: 24/844b lim: 40 exec/s: 31 rss: 67Mb L: 40/40 MS: 1 ChangeBit- 00:08:00.862 [2024-07-20 16:16:29.484378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.862 [2024-07-20 16:16:29.484403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.862 #32 NEW cov: 11738 ft: 14381 corp: 25/858b lim: 40 exec/s: 32 rss: 67Mb L: 14/40 MS: 1 InsertRepeatedBytes- 00:08:00.862 [2024-07-20 16:16:29.524890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.862 [2024-07-20 16:16:29.524915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.862 [2024-07-20 16:16:29.524975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:c0000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.862 [2024-07-20 
16:16:29.524989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.862 [2024-07-20 16:16:29.525048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.862 [2024-07-20 16:16:29.525061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.862 [2024-07-20 16:16:29.525115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.862 [2024-07-20 16:16:29.525128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.862 #33 NEW cov: 11738 ft: 14421 corp: 26/896b lim: 40 exec/s: 33 rss: 67Mb L: 38/40 MS: 1 InsertByte- 00:08:00.862 [2024-07-20 16:16:29.565057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.862 [2024-07-20 16:16:29.565083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.862 [2024-07-20 16:16:29.565143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.862 [2024-07-20 16:16:29.565157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.862 [2024-07-20 16:16:29.565217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.862 [2024-07-20 16:16:29.565231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.862 [2024-07-20 16:16:29.565292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:fd000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.862 [2024-07-20 16:16:29.565308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.862 #34 NEW cov: 11738 ft: 14430 corp: 27/930b lim: 40 exec/s: 34 rss: 67Mb L: 34/40 MS: 1 EraseBytes- 00:08:00.862 [2024-07-20 16:16:29.595196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.862 [2024-07-20 16:16:29.595221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.862 [2024-07-20 16:16:29.595276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000400 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.862 [2024-07-20 16:16:29.595289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.862 [2024-07-20 16:16:29.595346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.862 [2024-07-20 16:16:29.595358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.862 [2024-07-20 16:16:29.595413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.862 [2024-07-20 16:16:29.595426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.862 [2024-07-20 16:16:29.595500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.862 [2024-07-20 16:16:29.595514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:00.862 #35 NEW cov: 11738 ft: 14461 corp: 28/970b lim: 40 exec/s: 35 rss: 67Mb L: 40/40 MS: 1 CrossOver- 00:08:00.862 [2024-07-20 16:16:29.635321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000100 cdw11:0000a100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.862 [2024-07-20 16:16:29.635346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.862 [2024-07-20 16:16:29.635405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.862 [2024-07-20 16:16:29.635426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.862 [2024-07-20 16:16:29.635487] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.862 [2024-07-20 16:16:29.635499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.862 [2024-07-20 16:16:29.635556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.862 [2024-07-20 16:16:29.635569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.862 [2024-07-20 16:16:29.635625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.862 [2024-07-20 16:16:29.635638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:00.862 #36 NEW cov: 11738 ft: 14473 corp: 29/1010b lim: 40 exec/s: 36 rss: 68Mb L: 40/40 MS: 1 ChangeByte- 00:08:01.120 [2024-07-20 16:16:29.675421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:29000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.120 [2024-07-20 16:16:29.675449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.120 [2024-07-20 16:16:29.675503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) 
qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.120 [2024-07-20 16:16:29.675516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.120 [2024-07-20 16:16:29.675571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.120 [2024-07-20 16:16:29.675601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.120 [2024-07-20 16:16:29.675642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.120 [2024-07-20 16:16:29.675655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.120 [2024-07-20 16:16:29.675710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.120 [2024-07-20 16:16:29.675724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:01.120 #37 NEW cov: 11738 ft: 14484 corp: 30/1050b lim: 40 exec/s: 37 rss: 68Mb L: 40/40 MS: 1 ShuffleBytes- 00:08:01.120 [2024-07-20 16:16:29.715387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.120 [2024-07-20 16:16:29.715414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.120 [2024-07-20 16:16:29.715472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.120 [2024-07-20 16:16:29.715486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.120 [2024-07-20 16:16:29.715543] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:17171717 cdw11:17171700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.120 [2024-07-20 16:16:29.715556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.120 [2024-07-20 16:16:29.715610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000600 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.120 [2024-07-20 16:16:29.715623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.120 #38 NEW cov: 11738 ft: 14509 corp: 31/1089b lim: 40 exec/s: 38 rss: 68Mb L: 39/40 MS: 1 InsertRepeatedBytes- 00:08:01.120 [2024-07-20 16:16:29.755406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000400 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.120 [2024-07-20 16:16:29.755431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.120 [2024-07-20 16:16:29.755505] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.120 [2024-07-20 16:16:29.755525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.120 [2024-07-20 16:16:29.755582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.120 [2024-07-20 16:16:29.755595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.120 #39 NEW cov: 11738 ft: 14516 corp: 32/1118b lim: 40 exec/s: 39 rss: 68Mb L: 29/40 MS: 1 ChangeBit- 00:08:01.120 [2024-07-20 16:16:29.795773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000100 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.120 [2024-07-20 16:16:29.795798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.120 [2024-07-20 16:16:29.795871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.120 [2024-07-20 16:16:29.795885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.120 [2024-07-20 16:16:29.795942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.120 [2024-07-20 16:16:29.795955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.120 [2024-07-20 16:16:29.796012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.120 [2024-07-20 16:16:29.796025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.120 [2024-07-20 16:16:29.796079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.120 [2024-07-20 16:16:29.796093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:01.120 #40 NEW cov: 11738 ft: 14526 corp: 33/1158b lim: 40 exec/s: 40 rss: 68Mb L: 40/40 MS: 1 CopyPart- 00:08:01.120 [2024-07-20 16:16:29.835764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:98989898 cdw11:98989898 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.120 [2024-07-20 16:16:29.835790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.120 [2024-07-20 16:16:29.835850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:98980800 cdw11:04000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.120 [2024-07-20 16:16:29.835864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:08:01.120 [2024-07-20 16:16:29.835923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.120 [2024-07-20 16:16:29.835937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.120 [2024-07-20 16:16:29.835996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.120 [2024-07-20 16:16:29.836010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.120 #41 NEW cov: 11738 ft: 14538 corp: 34/1197b lim: 40 exec/s: 41 rss: 68Mb L: 39/40 MS: 1 InsertRepeatedBytes- 00:08:01.120 [2024-07-20 16:16:29.876012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000100 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.120 [2024-07-20 16:16:29.876038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.120 [2024-07-20 16:16:29.876099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.120 [2024-07-20 16:16:29.876112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.120 [2024-07-20 16:16:29.876167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.120 [2024-07-20 16:16:29.876180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.120 [2024-07-20 16:16:29.876236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.120 [2024-07-20 16:16:29.876251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.120 [2024-07-20 16:16:29.876309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.120 [2024-07-20 16:16:29.876324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:01.120 #42 NEW cov: 11738 ft: 14596 corp: 35/1237b lim: 40 exec/s: 42 rss: 68Mb L: 40/40 MS: 1 CrossOver- 00:08:01.120 [2024-07-20 16:16:29.915774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000006 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.120 [2024-07-20 16:16:29.915800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.120 [2024-07-20 16:16:29.915855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.120 [2024-07-20 16:16:29.915868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.379 #43 NEW cov: 11738 ft: 14607 corp: 36/1257b lim: 40 exec/s: 43 rss: 68Mb L: 20/40 MS: 1 EraseBytes- 00:08:01.379 [2024-07-20 16:16:29.956119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.379 [2024-07-20 16:16:29.956145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.379 [2024-07-20 16:16:29.956204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.379 [2024-07-20 16:16:29.956218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.379 [2024-07-20 16:16:29.956291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:000000fd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.379 [2024-07-20 16:16:29.956305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.379 [2024-07-20 16:16:29.956363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:000000fd cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.379 [2024-07-20 16:16:29.956376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.379 #44 NEW cov: 11738 ft: 14622 corp: 37/1294b lim: 40 exec/s: 44 rss: 68Mb L: 37/40 MS: 1 CopyPart- 00:08:01.379 [2024-07-20 16:16:29.996370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.379 [2024-07-20 16:16:29.996396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.379 [2024-07-20 16:16:29.996478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:04000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.379 [2024-07-20 16:16:29.996493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.379 [2024-07-20 16:16:29.996549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.379 [2024-07-20 16:16:29.996562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.379 [2024-07-20 16:16:29.996620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.379 [2024-07-20 16:16:29.996633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.379 [2024-07-20 16:16:29.996688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00001000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.379 [2024-07-20 16:16:29.996702] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:01.379 #50 NEW cov: 11738 ft: 14633 corp: 38/1334b lim: 40 exec/s: 50 rss: 68Mb L: 40/40 MS: 1 ShuffleBytes- 00:08:01.379 [2024-07-20 16:16:30.036145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.379 [2024-07-20 16:16:30.036172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.379 [2024-07-20 16:16:30.036226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00080000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.379 [2024-07-20 16:16:30.036240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.379 #51 NEW cov: 11738 ft: 14647 corp: 39/1356b lim: 40 exec/s: 51 rss: 68Mb L: 22/40 MS: 1 ChangeByte- 00:08:01.379 [2024-07-20 16:16:30.076284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.379 [2024-07-20 16:16:30.076320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.379 [2024-07-20 16:16:30.076380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00080000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.379 [2024-07-20 16:16:30.076394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.379 #52 NEW cov: 11738 ft: 14678 corp: 40/1376b lim: 40 exec/s: 52 rss: 68Mb L: 20/40 MS: 1 EraseBytes- 00:08:01.379 [2024-07-20 16:16:30.116775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.379 [2024-07-20 16:16:30.116803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.379 [2024-07-20 16:16:30.116876] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000008 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.379 [2024-07-20 16:16:30.116890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.379 [2024-07-20 16:16:30.116950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.379 [2024-07-20 16:16:30.116970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.379 [2024-07-20 16:16:30.117027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00820000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.379 [2024-07-20 16:16:30.117041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.379 [2024-07-20 16:16:30.117098] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.379 [2024-07-20 16:16:30.117112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:01.379 #53 NEW cov: 11738 ft: 14694 corp: 41/1416b lim: 40 exec/s: 53 rss: 69Mb L: 40/40 MS: 1 ChangeBinInt- 00:08:01.379 [2024-07-20 16:16:30.156460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.379 [2024-07-20 16:16:30.156486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.379 [2024-07-20 16:16:30.156563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00080000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.379 [2024-07-20 16:16:30.156577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.638 #54 NEW cov: 11738 ft: 14719 corp: 42/1436b lim: 40 exec/s: 54 rss: 69Mb L: 20/40 MS: 1 ShuffleBytes- 00:08:01.638 [2024-07-20 16:16:30.196975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.638 [2024-07-20 16:16:30.197001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.638 [2024-07-20 16:16:30.197076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000008 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.638 [2024-07-20 16:16:30.197090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.638 [2024-07-20 16:16:30.197148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.638 [2024-07-20 16:16:30.197162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.638 [2024-07-20 16:16:30.197219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.638 [2024-07-20 16:16:30.197232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.638 [2024-07-20 16:16:30.197290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.638 [2024-07-20 16:16:30.197303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:01.638 #55 NEW cov: 11738 ft: 14728 corp: 43/1476b lim: 40 exec/s: 55 rss: 69Mb L: 40/40 MS: 1 CrossOver- 00:08:01.638 [2024-07-20 16:16:30.237100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.638 [2024-07-20 16:16:30.237126] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.638 [2024-07-20 16:16:30.237198] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:01000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.638 [2024-07-20 16:16:30.237215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.638 [2024-07-20 16:16:30.237270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.638 [2024-07-20 16:16:30.237284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.638 [2024-07-20 16:16:30.237343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.638 [2024-07-20 16:16:30.237357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.638 [2024-07-20 16:16:30.237412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00001000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.638 [2024-07-20 16:16:30.237425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:01.638 #56 NEW cov: 11738 ft: 14738 corp: 44/1516b lim: 40 exec/s: 56 rss: 69Mb L: 40/40 MS: 1 ChangeBit- 00:08:01.638 [2024-07-20 16:16:30.277206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000600 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.638 [2024-07-20 16:16:30.277232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.638 [2024-07-20 16:16:30.277304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.638 [2024-07-20 16:16:30.277318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.638 [2024-07-20 16:16:30.277378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.638 [2024-07-20 16:16:30.277392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.638 [2024-07-20 16:16:30.277453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.638 [2024-07-20 16:16:30.277466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.638 [2024-07-20 16:16:30.277525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.638 [2024-07-20 16:16:30.277538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:01.638 #57 NEW cov: 11738 ft: 14743 corp: 45/1556b lim: 40 exec/s: 28 rss: 69Mb L: 40/40 MS: 1 CrossOver- 00:08:01.638 #57 DONE cov: 11738 ft: 14743 corp: 45/1556b lim: 40 exec/s: 28 rss: 69Mb 00:08:01.638 ###### Recommended dictionary. ###### 00:08:01.638 "\000\202" # Uses: 1 00:08:01.638 ###### End of recommended dictionary. ###### 00:08:01.638 Done 57 runs in 2 second(s) 00:08:01.638 16:16:30 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_10.conf 00:08:01.638 16:16:30 -- ../common.sh@72 -- # (( i++ )) 00:08:01.638 16:16:30 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:01.638 16:16:30 -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1 00:08:01.638 16:16:30 -- nvmf/run.sh@23 -- # local fuzzer_type=11 00:08:01.638 16:16:30 -- nvmf/run.sh@24 -- # local timen=1 00:08:01.638 16:16:30 -- nvmf/run.sh@25 -- # local core=0x1 00:08:01.638 16:16:30 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:01.638 16:16:30 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf 00:08:01.638 16:16:30 -- nvmf/run.sh@29 -- # printf %02d 11 00:08:01.638 16:16:30 -- nvmf/run.sh@29 -- # port=4411 00:08:01.638 16:16:30 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:01.638 16:16:30 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' 00:08:01.638 16:16:30 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:01.638 16:16:30 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 -r /var/tmp/spdk11.sock 00:08:01.896 [2024-07-20 16:16:30.451433] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:08:01.896 [2024-07-20 16:16:30.451508] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2274399 ] 00:08:01.896 EAL: No free 2048 kB hugepages reported on node 1 00:08:01.896 [2024-07-20 16:16:30.625015] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:01.896 [2024-07-20 16:16:30.644555] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:01.896 [2024-07-20 16:16:30.644676] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:01.896 [2024-07-20 16:16:30.696014] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:02.157 [2024-07-20 16:16:30.712354] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 *** 00:08:02.157 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:02.157 INFO: Seed: 1855992724 00:08:02.157 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:08:02.157 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:08:02.157 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:02.157 INFO: A corpus is not provided, starting from an empty corpus 00:08:02.157 #2 INITED exec/s: 0 rss: 59Mb 00:08:02.157 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:02.157 This may also happen if the target rejected all inputs we tried so far 00:08:02.157 [2024-07-20 16:16:30.757581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.157 [2024-07-20 16:16:30.757611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.415 NEW_FUNC[1/671]: 0x4a0390 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:08:02.415 NEW_FUNC[2/671]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:02.415 #7 NEW cov: 11515 ft: 11524 corp: 2/16b lim: 40 exec/s: 0 rss: 66Mb L: 15/15 MS: 5 ShuffleBytes-ChangeBit-InsertByte-EraseBytes-InsertRepeatedBytes- 00:08:02.415 [2024-07-20 16:16:31.068571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.415 [2024-07-20 16:16:31.068604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.415 [2024-07-20 16:16:31.068658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffff2525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.415 [2024-07-20 16:16:31.068671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.415 [2024-07-20 16:16:31.068724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:252525ff cdw11:ffffff31 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.415 [2024-07-20 16:16:31.068741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.415 #8 NEW cov: 11636 ft: 12721 corp: 3/40b lim: 40 exec/s: 0 rss: 66Mb L: 24/24 MS: 1 InsertRepeatedBytes- 00:08:02.415 [2024-07-20 16:16:31.118641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a5b3838 cdw11:38383838 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.415 [2024-07-20 16:16:31.118668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.415 [2024-07-20 16:16:31.118724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:38383838 cdw11:38383838 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.415 [2024-07-20 16:16:31.118738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.415 [2024-07-20 16:16:31.118790] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:38383838 cdw11:38383838 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.415 [2024-07-20 16:16:31.118803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.415 #10 NEW cov: 11642 ft: 12970 corp: 4/65b lim: 40 exec/s: 0 rss: 66Mb L: 25/25 MS: 2 InsertByte-InsertRepeatedBytes- 00:08:02.415 [2024-07-20 16:16:31.158414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.415 [2024-07-20 16:16:31.158440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.415 #11 NEW cov: 11727 ft: 13221 corp: 5/76b lim: 40 exec/s: 0 rss: 66Mb L: 11/25 MS: 1 CrossOver- 00:08:02.415 [2024-07-20 16:16:31.198719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.415 [2024-07-20 16:16:31.198746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.415 [2024-07-20 16:16:31.198803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.415 [2024-07-20 16:16:31.198817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.674 #12 NEW cov: 11727 ft: 13462 corp: 6/98b lim: 40 exec/s: 0 rss: 66Mb L: 22/25 MS: 1 CrossOver- 00:08:02.674 [2024-07-20 16:16:31.238838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffff25 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.674 [2024-07-20 16:16:31.238863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.674 [2024-07-20 16:16:31.238918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:25252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.674 [2024-07-20 16:16:31.238932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.674 #13 NEW cov: 11727 ft: 13659 corp: 7/119b lim: 40 exec/s: 0 rss: 66Mb L: 21/25 MS: 1 EraseBytes- 00:08:02.674 [2024-07-20 16:16:31.279125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff28ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.674 [2024-07-20 16:16:31.279150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.674 [2024-07-20 16:16:31.279204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffff25 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.674 [2024-07-20 16:16:31.279217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.674 [2024-07-20 16:16:31.279273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:25252525 cdw11:ffffffff SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:08:02.674 [2024-07-20 16:16:31.279286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.674 #14 NEW cov: 11727 ft: 13783 corp: 8/144b lim: 40 exec/s: 0 rss: 66Mb L: 25/25 MS: 1 InsertByte- 00:08:02.674 [2024-07-20 16:16:31.318933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.674 [2024-07-20 16:16:31.318959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.674 #15 NEW cov: 11727 ft: 13787 corp: 9/159b lim: 40 exec/s: 0 rss: 66Mb L: 15/25 MS: 1 InsertRepeatedBytes- 00:08:02.674 [2024-07-20 16:16:31.359344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.674 [2024-07-20 16:16:31.359369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.674 [2024-07-20 16:16:31.359423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.674 [2024-07-20 16:16:31.359436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.674 [2024-07-20 16:16:31.359495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:25252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.674 [2024-07-20 16:16:31.359508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.674 #16 NEW cov: 11727 ft: 13817 corp: 10/189b lim: 40 exec/s: 0 rss: 67Mb L: 30/30 MS: 1 InsertRepeatedBytes- 00:08:02.674 [2024-07-20 16:16:31.399106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:fffffffe cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.674 [2024-07-20 16:16:31.399132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.674 #17 NEW cov: 11727 ft: 13929 corp: 11/200b lim: 40 exec/s: 0 rss: 67Mb L: 11/30 MS: 1 ChangeByte- 00:08:02.674 [2024-07-20 16:16:31.429570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.674 [2024-07-20 16:16:31.429597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.674 [2024-07-20 16:16:31.429654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffff2525 cdw11:252525ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.674 [2024-07-20 16:16:31.429668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.674 [2024-07-20 16:16:31.429722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffff31 cdw11:ffffff31 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.674 [2024-07-20 16:16:31.429735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.674 #18 NEW cov: 11727 ft: 13947 corp: 12/224b lim: 40 exec/s: 0 rss: 67Mb L: 24/30 MS: 1 CrossOver- 00:08:02.674 [2024-07-20 16:16:31.469354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:caffffff cdw11:feffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.674 [2024-07-20 16:16:31.469380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.932 #19 NEW cov: 11727 ft: 14019 corp: 13/236b lim: 40 exec/s: 0 rss: 67Mb L: 12/30 MS: 1 InsertByte- 00:08:02.932 [2024-07-20 16:16:31.509767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.932 [2024-07-20 16:16:31.509793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.932 [2024-07-20 16:16:31.509850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffff2525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.932 [2024-07-20 16:16:31.509863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.932 [2024-07-20 16:16:31.509919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:252525ff cdw11:ff1eff31 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.932 [2024-07-20 16:16:31.509932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.932 #20 NEW cov: 11727 ft: 14041 corp: 14/260b lim: 40 exec/s: 0 rss: 67Mb L: 24/30 MS: 1 ChangeByte- 00:08:02.932 [2024-07-20 16:16:31.549574] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:fffffffe cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.932 [2024-07-20 16:16:31.549607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.932 #21 NEW cov: 11727 ft: 14071 corp: 15/268b lim: 40 exec/s: 0 rss: 67Mb L: 8/30 MS: 1 EraseBytes- 00:08:02.932 [2024-07-20 16:16:31.589845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.932 [2024-07-20 16:16:31.589871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.932 [2024-07-20 16:16:31.589929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ff0affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.932 [2024-07-20 16:16:31.589943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.932 #22 NEW cov: 11727 ft: 14109 corp: 16/284b lim: 40 exec/s: 0 rss: 67Mb L: 16/30 MS: 1 EraseBytes- 00:08:02.932 [2024-07-20 16:16:31.630272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.932 [2024-07-20 16:16:31.630298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.932 [2024-07-20 16:16:31.630355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.932 [2024-07-20 16:16:31.630369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.932 [2024-07-20 16:16:31.630423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.932 [2024-07-20 16:16:31.630437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.932 [2024-07-20 16:16:31.630497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.932 [2024-07-20 16:16:31.630510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.932 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:02.932 #25 NEW cov: 11750 ft: 14413 corp: 17/320b lim: 40 exec/s: 0 rss: 67Mb L: 36/36 MS: 3 CrossOver-CopyPart-InsertRepeatedBytes- 00:08:02.932 [2024-07-20 16:16:31.680238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:caffffff cdw11:feffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.932 [2024-07-20 16:16:31.680267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.932 [2024-07-20 16:16:31.680325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.932 [2024-07-20 16:16:31.680338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.932 [2024-07-20 16:16:31.680393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.932 [2024-07-20 16:16:31.680406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.932 #26 NEW cov: 11750 ft: 14433 corp: 18/349b lim: 40 exec/s: 0 rss: 67Mb L: 29/36 MS: 1 InsertRepeatedBytes- 00:08:02.932 [2024-07-20 16:16:31.720063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ff63ffff cdw11:feffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.932 [2024-07-20 16:16:31.720088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.190 #27 NEW cov: 11750 ft: 14463 corp: 19/358b lim: 40 exec/s: 27 rss: 67Mb L: 9/36 MS: 1 InsertByte- 00:08:03.190 [2024-07-20 16:16:31.760465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a5b38ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.190 [2024-07-20 16:16:31.760492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.190 [2024-07-20 16:16:31.760550] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:0affffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.190 [2024-07-20 16:16:31.760563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.190 [2024-07-20 16:16:31.760617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffff3838 cdw11:ff383838 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.190 [2024-07-20 16:16:31.760631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.190 #28 NEW cov: 11750 ft: 14487 corp: 20/382b lim: 40 exec/s: 28 rss: 67Mb L: 24/36 MS: 1 CrossOver- 00:08:03.190 [2024-07-20 16:16:31.800602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff28ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.190 [2024-07-20 16:16:31.800628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.190 [2024-07-20 16:16:31.800686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffff25 cdw11:2a252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.190 [2024-07-20 16:16:31.800699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.190 [2024-07-20 16:16:31.800756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:25252525 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.191 [2024-07-20 16:16:31.800769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.191 #29 NEW cov: 11750 ft: 14508 corp: 21/407b lim: 40 exec/s: 29 rss: 68Mb L: 25/36 MS: 1 ChangeBinInt- 00:08:03.191 [2024-07-20 16:16:31.840730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.191 [2024-07-20 16:16:31.840756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.191 [2024-07-20 16:16:31.840817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffff2525 cdw11:252525ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.191 [2024-07-20 16:16:31.840831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.191 [2024-07-20 16:16:31.840900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffff0a cdw11:ffffff31 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.191 [2024-07-20 16:16:31.840914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.191 #30 NEW cov: 11750 ft: 14516 corp: 22/431b lim: 40 exec/s: 30 rss: 68Mb L: 24/36 MS: 1 CrossOver- 00:08:03.191 [2024-07-20 16:16:31.880631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffff25 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.191 [2024-07-20 16:16:31.880657] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.191 [2024-07-20 16:16:31.880715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:25252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.191 [2024-07-20 16:16:31.880728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.191 #31 NEW cov: 11750 ft: 14570 corp: 23/452b lim: 40 exec/s: 31 rss: 68Mb L: 21/36 MS: 1 ChangeASCIIInt- 00:08:03.191 [2024-07-20 16:16:31.920929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff8cff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.191 [2024-07-20 16:16:31.920955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.191 [2024-07-20 16:16:31.921008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.191 [2024-07-20 16:16:31.921022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.191 [2024-07-20 16:16:31.921074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:8c8c8cff cdw11:ffff0aff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.191 [2024-07-20 16:16:31.921088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.191 #32 NEW cov: 11750 ft: 14584 corp: 24/476b lim: 40 exec/s: 32 rss: 68Mb L: 24/36 MS: 1 CopyPart- 00:08:03.191 [2024-07-20 16:16:31.961038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff8cff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.191 [2024-07-20 16:16:31.961063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.191 [2024-07-20 16:16:31.961114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:8c8c328c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.191 [2024-07-20 16:16:31.961128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.191 [2024-07-20 16:16:31.961180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:8c8c8cff cdw11:ffff0aff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.191 [2024-07-20 16:16:31.961193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.191 #33 NEW cov: 11750 ft: 14597 corp: 25/500b lim: 40 exec/s: 33 rss: 68Mb L: 24/36 MS: 1 ChangeByte- 00:08:03.449 [2024-07-20 16:16:32.001206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff28ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.449 [2024-07-20 16:16:32.001234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.449 [2024-07-20 16:16:32.001295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 
cdw10:ffffff25 cdw11:2a252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.449 [2024-07-20 16:16:32.001309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.449 [2024-07-20 16:16:32.001360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:25252525 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.449 [2024-07-20 16:16:32.001374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.449 #34 NEW cov: 11750 ft: 14622 corp: 26/525b lim: 40 exec/s: 34 rss: 68Mb L: 25/36 MS: 1 ChangeASCIIInt- 00:08:03.449 [2024-07-20 16:16:32.041103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffff28ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.449 [2024-07-20 16:16:32.041127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.449 [2024-07-20 16:16:32.041180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:25252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.449 [2024-07-20 16:16:32.041194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.449 #35 NEW cov: 11750 ft: 14650 corp: 27/547b lim: 40 exec/s: 35 rss: 68Mb L: 22/36 MS: 1 InsertByte- 00:08:03.449 [2024-07-20 16:16:32.081379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:fffffffe cdw11:ffffff00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.449 [2024-07-20 16:16:32.081404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.449 [2024-07-20 16:16:32.081458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.449 [2024-07-20 16:16:32.081472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.449 [2024-07-20 16:16:32.081526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.449 [2024-07-20 16:16:32.081539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.449 #41 NEW cov: 11750 ft: 14666 corp: 28/572b lim: 40 exec/s: 41 rss: 68Mb L: 25/36 MS: 1 InsertRepeatedBytes- 00:08:03.449 [2024-07-20 16:16:32.121328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffff25 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.449 [2024-07-20 16:16:32.121353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.449 [2024-07-20 16:16:32.121406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:25252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.449 [2024-07-20 16:16:32.121419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:08:03.449 #42 NEW cov: 11750 ft: 14674 corp: 29/593b lim: 40 exec/s: 42 rss: 68Mb L: 21/36 MS: 1 ChangeASCIIInt- 00:08:03.449 [2024-07-20 16:16:32.161475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:caffffff cdw11:feffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.449 [2024-07-20 16:16:32.161502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.449 [2024-07-20 16:16:32.161554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff0affff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.449 [2024-07-20 16:16:32.161571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.449 #43 NEW cov: 11750 ft: 14675 corp: 30/609b lim: 40 exec/s: 43 rss: 68Mb L: 16/36 MS: 1 CrossOver- 00:08:03.449 [2024-07-20 16:16:32.202027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.449 [2024-07-20 16:16:32.202052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.449 [2024-07-20 16:16:32.202105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.449 [2024-07-20 16:16:32.202118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.449 [2024-07-20 16:16:32.202170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:000000b9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.449 [2024-07-20 16:16:32.202183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.449 [2024-07-20 16:16:32.202237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:b9b9b900 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.449 [2024-07-20 16:16:32.202250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.449 [2024-07-20 16:16:32.202302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:00ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.449 [2024-07-20 16:16:32.202315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:03.449 #44 NEW cov: 11750 ft: 14750 corp: 31/649b lim: 40 exec/s: 44 rss: 68Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:08:03.449 [2024-07-20 16:16:32.241839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.449 [2024-07-20 16:16:32.241864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.449 [2024-07-20 16:16:32.241918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffff4025 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.449 [2024-07-20 
16:16:32.241932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.449 [2024-07-20 16:16:32.241983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:252525ff cdw11:ffffff31 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.449 [2024-07-20 16:16:32.241997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.708 #45 NEW cov: 11750 ft: 14760 corp: 32/673b lim: 40 exec/s: 45 rss: 68Mb L: 24/40 MS: 1 ChangeByte- 00:08:03.708 [2024-07-20 16:16:32.281963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.708 [2024-07-20 16:16:32.281990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.708 [2024-07-20 16:16:32.282058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:fffdd9da cdw11:dadadada SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.708 [2024-07-20 16:16:32.282072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.708 [2024-07-20 16:16:32.282122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:da2525ff cdw11:ff1eff31 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.708 [2024-07-20 16:16:32.282139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.708 #46 NEW cov: 11750 ft: 14763 corp: 33/697b lim: 40 exec/s: 46 rss: 68Mb L: 24/40 MS: 1 ChangeBinInt- 00:08:03.708 [2024-07-20 16:16:32.321927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffff25 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.708 [2024-07-20 16:16:32.321952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.708 [2024-07-20 16:16:32.322002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:25252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.708 [2024-07-20 16:16:32.322016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.708 #47 NEW cov: 11750 ft: 14765 corp: 34/718b lim: 40 exec/s: 47 rss: 68Mb L: 21/40 MS: 1 ChangeASCIIInt- 00:08:03.708 [2024-07-20 16:16:32.362083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffff25 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.708 [2024-07-20 16:16:32.362109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.708 [2024-07-20 16:16:32.362162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:25252525 cdw11:252525ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.708 [2024-07-20 16:16:32.362176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.708 #48 NEW cov: 11750 ft: 14771 corp: 35/739b lim: 40 exec/s: 48 rss: 68Mb L: 21/40 MS: 1 
ShuffleBytes- 00:08:03.708 [2024-07-20 16:16:32.402194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:002ef269 cdw11:47dfc134 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.708 [2024-07-20 16:16:32.402220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.708 [2024-07-20 16:16:32.402274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:fffffffe cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.708 [2024-07-20 16:16:32.402287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.708 #49 NEW cov: 11750 ft: 14802 corp: 36/758b lim: 40 exec/s: 49 rss: 68Mb L: 19/40 MS: 1 CMP- DE: "\000.\362iG\337\3014"- 00:08:03.708 [2024-07-20 16:16:32.442698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.708 [2024-07-20 16:16:32.442723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.708 [2024-07-20 16:16:32.442777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.708 [2024-07-20 16:16:32.442790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.708 [2024-07-20 16:16:32.442841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.708 [2024-07-20 16:16:32.442854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.708 [2024-07-20 16:16:32.442904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.708 [2024-07-20 16:16:32.442917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.708 [2024-07-20 16:16:32.442977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:00ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.708 [2024-07-20 16:16:32.442990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:03.708 #50 NEW cov: 11750 ft: 14813 corp: 37/798b lim: 40 exec/s: 50 rss: 69Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:08:03.708 [2024-07-20 16:16:32.482405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff969696 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.708 [2024-07-20 16:16:32.482430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.708 [2024-07-20 16:16:32.482490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ff8c8c8c cdw11:8cffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.708 [2024-07-20 16:16:32.482503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.708 #51 NEW cov: 11750 ft: 14824 corp: 38/816b lim: 40 exec/s: 51 rss: 69Mb L: 18/40 MS: 1 InsertRepeatedBytes- 00:08:03.967 [2024-07-20 16:16:32.522781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:36363636 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.967 [2024-07-20 16:16:32.522805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.967 [2024-07-20 16:16:32.522862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:36363636 cdw11:36363636 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.967 [2024-07-20 16:16:32.522876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.967 [2024-07-20 16:16:32.522930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:363636ff cdw11:ffff2525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.967 [2024-07-20 16:16:32.522943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.967 [2024-07-20 16:16:32.522994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:25252525 cdw11:252525ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.967 [2024-07-20 16:16:32.523007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.967 #52 NEW cov: 11750 ft: 14860 corp: 39/852b lim: 40 exec/s: 52 rss: 69Mb L: 36/40 MS: 1 InsertRepeatedBytes- 00:08:03.967 [2024-07-20 16:16:32.562477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:002ef269 cdw11:47dfc134 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.967 [2024-07-20 16:16:32.562502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.967 #57 NEW cov: 11750 ft: 14873 corp: 40/862b lim: 40 exec/s: 57 rss: 69Mb L: 10/40 MS: 5 ChangeByte-CopyPart-CopyPart-ChangeBit-PersAutoDict- DE: "\000.\362iG\337\3014"- 00:08:03.967 [2024-07-20 16:16:32.592751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffefffff cdw11:ffffff25 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.967 [2024-07-20 16:16:32.592776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.967 [2024-07-20 16:16:32.592831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:25252525 cdw11:252525ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.967 [2024-07-20 16:16:32.592844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.967 #58 NEW cov: 11750 ft: 14877 corp: 41/883b lim: 40 exec/s: 58 rss: 69Mb L: 21/40 MS: 1 ChangeBit- 00:08:03.967 [2024-07-20 16:16:32.632835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff969696 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.967 [2024-07-20 16:16:32.632860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:08:03.967 [2024-07-20 16:16:32.632914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ff8c7373 cdw11:738c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.967 [2024-07-20 16:16:32.632927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.967 #59 NEW cov: 11750 ft: 14881 corp: 42/904b lim: 40 exec/s: 59 rss: 69Mb L: 21/40 MS: 1 InsertRepeatedBytes- 00:08:03.967 [2024-07-20 16:16:32.673237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff28ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.967 [2024-07-20 16:16:32.673262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.967 [2024-07-20 16:16:32.673319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffff25 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.967 [2024-07-20 16:16:32.673332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.967 [2024-07-20 16:16:32.673401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:25252525 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.967 [2024-07-20 16:16:32.673415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.967 [2024-07-20 16:16:32.673472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:31464646 cdw11:46464646 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.967 [2024-07-20 16:16:32.673485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.967 #60 NEW cov: 11750 ft: 14883 corp: 43/936b lim: 40 exec/s: 60 rss: 69Mb L: 32/40 MS: 1 InsertRepeatedBytes- 00:08:03.967 [2024-07-20 16:16:32.713086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffefffff cdw11:ffffff25 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.967 [2024-07-20 16:16:32.713111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.967 [2024-07-20 16:16:32.713163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:25252525 cdw11:252525ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.967 [2024-07-20 16:16:32.713176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.967 #61 NEW cov: 11750 ft: 14891 corp: 44/957b lim: 40 exec/s: 61 rss: 69Mb L: 21/40 MS: 1 ShuffleBytes- 00:08:03.967 [2024-07-20 16:16:32.753398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:fffffffe cdw11:ffffffd2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.967 [2024-07-20 16:16:32.753423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.967 [2024-07-20 16:16:32.753476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.967 
[2024-07-20 16:16:32.753490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.967 [2024-07-20 16:16:32.753544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.967 [2024-07-20 16:16:32.753557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.227 #62 NEW cov: 11750 ft: 14895 corp: 45/982b lim: 40 exec/s: 31 rss: 69Mb L: 25/40 MS: 1 ChangeByte- 00:08:04.227 #62 DONE cov: 11750 ft: 14895 corp: 45/982b lim: 40 exec/s: 31 rss: 69Mb 00:08:04.227 ###### Recommended dictionary. ###### 00:08:04.227 "\000.\362iG\337\3014" # Uses: 1 00:08:04.227 ###### End of recommended dictionary. ###### 00:08:04.227 Done 62 runs in 2 second(s) 00:08:04.227 16:16:32 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_11.conf 00:08:04.227 16:16:32 -- ../common.sh@72 -- # (( i++ )) 00:08:04.227 16:16:32 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:04.227 16:16:32 -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:08:04.227 16:16:32 -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:08:04.227 16:16:32 -- nvmf/run.sh@24 -- # local timen=1 00:08:04.227 16:16:32 -- nvmf/run.sh@25 -- # local core=0x1 00:08:04.227 16:16:32 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:04.227 16:16:32 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:08:04.227 16:16:32 -- nvmf/run.sh@29 -- # printf %02d 12 00:08:04.227 16:16:32 -- nvmf/run.sh@29 -- # port=4412 00:08:04.227 16:16:32 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:04.227 16:16:32 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:08:04.227 16:16:32 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:04.227 16:16:32 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 -r /var/tmp/spdk12.sock 00:08:04.227 [2024-07-20 16:16:32.937010] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
00:08:04.227 [2024-07-20 16:16:32.937081] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2274693 ] 00:08:04.227 EAL: No free 2048 kB hugepages reported on node 1 00:08:04.485 [2024-07-20 16:16:33.113885] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:04.485 [2024-07-20 16:16:33.133384] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:04.485 [2024-07-20 16:16:33.133532] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:04.485 [2024-07-20 16:16:33.184855] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:04.485 [2024-07-20 16:16:33.201189] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:08:04.485 INFO: Running with entropic power schedule (0xFF, 100). 00:08:04.485 INFO: Seed: 50029009 00:08:04.485 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:08:04.485 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:08:04.485 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:04.485 INFO: A corpus is not provided, starting from an empty corpus 00:08:04.485 #2 INITED exec/s: 0 rss: 60Mb 00:08:04.485 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:04.485 This may also happen if the target rejected all inputs we tried so far 00:08:04.485 [2024-07-20 16:16:33.267834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.485 [2024-07-20 16:16:33.267872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.485 [2024-07-20 16:16:33.268009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.485 [2024-07-20 16:16:33.268029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.485 [2024-07-20 16:16:33.268166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.485 [2024-07-20 16:16:33.268184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.001 NEW_FUNC[1/671]: 0x4a2100 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:08:05.001 NEW_FUNC[2/671]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:05.001 #12 NEW cov: 11517 ft: 11518 corp: 2/31b lim: 40 exec/s: 0 rss: 66Mb L: 30/30 MS: 5 CopyPart-InsertByte-CrossOver-ChangeByte-InsertRepeatedBytes- 00:08:05.001 [2024-07-20 16:16:33.608582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:002b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.001 [2024-07-20 16:16:33.608622] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.001 [2024-07-20 16:16:33.608764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.001 [2024-07-20 16:16:33.608783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.001 [2024-07-20 16:16:33.608917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.001 [2024-07-20 16:16:33.608946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.001 #13 NEW cov: 11634 ft: 12058 corp: 3/61b lim: 40 exec/s: 0 rss: 66Mb L: 30/30 MS: 1 ChangeByte- 00:08:05.001 [2024-07-20 16:16:33.658693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:002b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.001 [2024-07-20 16:16:33.658726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.001 [2024-07-20 16:16:33.658844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.001 [2024-07-20 16:16:33.658862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.001 [2024-07-20 16:16:33.658977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0000007b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.001 [2024-07-20 16:16:33.658995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.001 #14 NEW cov: 11640 ft: 12371 corp: 4/92b lim: 40 exec/s: 0 rss: 66Mb L: 31/31 MS: 1 InsertByte- 00:08:05.001 [2024-07-20 16:16:33.698101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.001 [2024-07-20 16:16:33.698127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.001 [2024-07-20 16:16:33.698254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.001 [2024-07-20 16:16:33.698272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.001 #16 NEW cov: 11725 ft: 12860 corp: 5/111b lim: 40 exec/s: 0 rss: 66Mb L: 19/31 MS: 2 InsertByte-InsertRepeatedBytes- 00:08:05.001 [2024-07-20 16:16:33.738835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:002b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.001 [2024-07-20 16:16:33.738867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.001 [2024-07-20 16:16:33.739006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 
cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.001 [2024-07-20 16:16:33.739024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.001 [2024-07-20 16:16:33.739144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0000007b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.001 [2024-07-20 16:16:33.739161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.001 #17 NEW cov: 11725 ft: 13031 corp: 6/142b lim: 40 exec/s: 0 rss: 67Mb L: 31/31 MS: 1 ShuffleBytes- 00:08:05.001 [2024-07-20 16:16:33.778587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:002b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.001 [2024-07-20 16:16:33.778614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.001 [2024-07-20 16:16:33.778753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.001 [2024-07-20 16:16:33.778769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.001 [2024-07-20 16:16:33.778900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0000feff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.001 [2024-07-20 16:16:33.778918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.001 #18 NEW cov: 11725 ft: 13077 corp: 7/172b lim: 40 exec/s: 0 rss: 67Mb L: 30/31 MS: 1 CMP- DE: "\376\377\377\377\000\000\000\000"- 00:08:05.259 [2024-07-20 16:16:33.819090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:002b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.259 [2024-07-20 16:16:33.819119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.259 [2024-07-20 16:16:33.819255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.259 [2024-07-20 16:16:33.819273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.259 [2024-07-20 16:16:33.819397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0000007b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.259 [2024-07-20 16:16:33.819414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.259 #19 NEW cov: 11725 ft: 13160 corp: 8/203b lim: 40 exec/s: 0 rss: 67Mb L: 31/31 MS: 1 ShuffleBytes- 00:08:05.259 [2024-07-20 16:16:33.869264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:002b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.259 [2024-07-20 16:16:33.869291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:08:05.259 [2024-07-20 16:16:33.869415] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0000007b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.259 [2024-07-20 16:16:33.869431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.259 [2024-07-20 16:16:33.869561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.259 [2024-07-20 16:16:33.869592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.259 #20 NEW cov: 11725 ft: 13201 corp: 9/230b lim: 40 exec/s: 0 rss: 67Mb L: 27/31 MS: 1 EraseBytes- 00:08:05.259 [2024-07-20 16:16:33.909082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.259 [2024-07-20 16:16:33.909110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.259 [2024-07-20 16:16:33.909240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00400000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.259 [2024-07-20 16:16:33.909256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.259 #21 NEW cov: 11725 ft: 13306 corp: 10/249b lim: 40 exec/s: 0 rss: 67Mb L: 19/31 MS: 1 ChangeBit- 00:08:05.259 [2024-07-20 16:16:33.949491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:002b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.259 [2024-07-20 16:16:33.949519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.259 [2024-07-20 16:16:33.949647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.259 [2024-07-20 16:16:33.949664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.259 [2024-07-20 16:16:33.949793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00002900 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.259 [2024-07-20 16:16:33.949810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.259 #22 NEW cov: 11725 ft: 13392 corp: 11/280b lim: 40 exec/s: 0 rss: 67Mb L: 31/31 MS: 1 InsertByte- 00:08:05.259 [2024-07-20 16:16:33.989339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.259 [2024-07-20 16:16:33.989368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.259 [2024-07-20 16:16:33.989501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00400000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.259 
[2024-07-20 16:16:33.989519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.259 #23 NEW cov: 11725 ft: 13433 corp: 12/299b lim: 40 exec/s: 0 rss: 67Mb L: 19/31 MS: 1 ShuffleBytes- 00:08:05.259 [2024-07-20 16:16:34.029260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7429feff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.259 [2024-07-20 16:16:34.029287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.259 #27 NEW cov: 11725 ft: 14158 corp: 13/309b lim: 40 exec/s: 0 rss: 67Mb L: 10/31 MS: 4 ChangeBinInt-ChangeBit-InsertByte-PersAutoDict- DE: "\376\377\377\377\000\000\000\000"- 00:08:05.517 [2024-07-20 16:16:34.069366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7b0a0000 cdw11:00002b00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.517 [2024-07-20 16:16:34.069394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.517 #32 NEW cov: 11725 ft: 14185 corp: 14/317b lim: 40 exec/s: 0 rss: 67Mb L: 8/31 MS: 5 InsertByte-ChangeByte-EraseBytes-CrossOver-CrossOver- 00:08:05.517 [2024-07-20 16:16:34.110063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:002b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.517 [2024-07-20 16:16:34.110095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.517 [2024-07-20 16:16:34.110225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.517 [2024-07-20 16:16:34.110242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.518 [2024-07-20 16:16:34.110370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00002900 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.518 [2024-07-20 16:16:34.110387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.518 #33 NEW cov: 11725 ft: 14213 corp: 15/348b lim: 40 exec/s: 0 rss: 67Mb L: 31/31 MS: 1 ChangeBit- 00:08:05.518 [2024-07-20 16:16:34.150167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.518 [2024-07-20 16:16:34.150195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.518 [2024-07-20 16:16:34.150312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.518 [2024-07-20 16:16:34.150329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.518 [2024-07-20 16:16:34.150450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.518 [2024-07-20 
16:16:34.150465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.518 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:05.518 #34 NEW cov: 11748 ft: 14251 corp: 16/378b lim: 40 exec/s: 0 rss: 68Mb L: 30/31 MS: 1 ChangeBit- 00:08:05.518 [2024-07-20 16:16:34.190577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:002b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.518 [2024-07-20 16:16:34.190606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.518 [2024-07-20 16:16:34.190730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.518 [2024-07-20 16:16:34.190747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.518 [2024-07-20 16:16:34.190837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000100 cdw11:29000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.518 [2024-07-20 16:16:34.190855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.518 [2024-07-20 16:16:34.190983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.518 [2024-07-20 16:16:34.191000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.518 #35 NEW cov: 11748 ft: 14549 corp: 17/411b lim: 40 exec/s: 0 rss: 68Mb L: 33/33 MS: 1 CMP- DE: "\001\000"- 00:08:05.518 [2024-07-20 16:16:34.230580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:002b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.518 [2024-07-20 16:16:34.230608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.518 [2024-07-20 16:16:34.230739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.518 [2024-07-20 16:16:34.230756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.518 [2024-07-20 16:16:34.230873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000106 cdw11:29000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.518 [2024-07-20 16:16:34.230891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.518 [2024-07-20 16:16:34.231017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.518 [2024-07-20 16:16:34.231032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.518 #36 NEW cov: 11748 ft: 14580 corp: 18/444b lim: 40 exec/s: 36 rss: 
68Mb L: 33/33 MS: 1 ChangeByte- 00:08:05.518 [2024-07-20 16:16:34.270710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:002b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.518 [2024-07-20 16:16:34.270737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.518 [2024-07-20 16:16:34.270870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.518 [2024-07-20 16:16:34.270887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.518 [2024-07-20 16:16:34.271018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.518 [2024-07-20 16:16:34.271035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.518 [2024-07-20 16:16:34.271158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.518 [2024-07-20 16:16:34.271175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.518 #37 NEW cov: 11748 ft: 14591 corp: 19/477b lim: 40 exec/s: 37 rss: 68Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:08:05.518 [2024-07-20 16:16:34.310848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:002b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.518 [2024-07-20 16:16:34.310875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.518 [2024-07-20 16:16:34.310998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00004500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.518 [2024-07-20 16:16:34.311015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.518 [2024-07-20 16:16:34.311141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.518 [2024-07-20 16:16:34.311158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.518 [2024-07-20 16:16:34.311288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.518 [2024-07-20 16:16:34.311304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.776 #38 NEW cov: 11748 ft: 14614 corp: 20/511b lim: 40 exec/s: 38 rss: 68Mb L: 34/34 MS: 1 InsertByte- 00:08:05.776 [2024-07-20 16:16:34.350128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7b290000 cdw11:00002b00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.776 [2024-07-20 16:16:34.350155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.776 #39 NEW cov: 11748 ft: 14697 corp: 21/519b lim: 40 exec/s: 39 rss: 68Mb L: 8/34 MS: 1 ChangeByte- 00:08:05.776 [2024-07-20 16:16:34.390481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000100 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.776 [2024-07-20 16:16:34.390510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.776 [2024-07-20 16:16:34.390634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000040 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.776 [2024-07-20 16:16:34.390651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.776 #40 NEW cov: 11748 ft: 14710 corp: 22/540b lim: 40 exec/s: 40 rss: 68Mb L: 21/34 MS: 1 PersAutoDict- DE: "\001\000"- 00:08:05.776 [2024-07-20 16:16:34.431258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:2b000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.776 [2024-07-20 16:16:34.431287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.776 [2024-07-20 16:16:34.431423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.776 [2024-07-20 16:16:34.431440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.776 [2024-07-20 16:16:34.431583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00290000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.776 [2024-07-20 16:16:34.431600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.777 [2024-07-20 16:16:34.431721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00007b0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.777 [2024-07-20 16:16:34.431737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.777 #46 NEW cov: 11748 ft: 14727 corp: 23/574b lim: 40 exec/s: 46 rss: 68Mb L: 34/34 MS: 1 CrossOver- 00:08:05.777 [2024-07-20 16:16:34.471322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:002b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.777 [2024-07-20 16:16:34.471351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.777 [2024-07-20 16:16:34.471470] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0000ff00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.777 [2024-07-20 16:16:34.471489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.777 [2024-07-20 16:16:34.471618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.777 [2024-07-20 16:16:34.471634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.777 [2024-07-20 16:16:34.471751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.777 [2024-07-20 16:16:34.471785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.777 #47 NEW cov: 11748 ft: 14748 corp: 24/608b lim: 40 exec/s: 47 rss: 68Mb L: 34/34 MS: 1 InsertByte- 00:08:05.777 [2024-07-20 16:16:34.511224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:000000fe cdw11:ffffff00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.777 [2024-07-20 16:16:34.511254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.777 [2024-07-20 16:16:34.511380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.777 [2024-07-20 16:16:34.511399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.777 [2024-07-20 16:16:34.511525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00400000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.777 [2024-07-20 16:16:34.511542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.777 #48 NEW cov: 11748 ft: 14766 corp: 25/635b lim: 40 exec/s: 48 rss: 68Mb L: 27/34 MS: 1 PersAutoDict- DE: "\376\377\377\377\000\000\000\000"- 00:08:05.777 [2024-07-20 16:16:34.551608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.777 [2024-07-20 16:16:34.551638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.777 [2024-07-20 16:16:34.551750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00002b00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.777 [2024-07-20 16:16:34.551768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.777 [2024-07-20 16:16:34.551885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000100 cdw11:29000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.777 [2024-07-20 16:16:34.551903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.777 [2024-07-20 16:16:34.552024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.777 [2024-07-20 16:16:34.552042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.777 #49 NEW cov: 11748 ft: 14797 corp: 26/668b lim: 40 exec/s: 49 rss: 68Mb L: 33/34 
MS: 1 ShuffleBytes- 00:08:06.035 [2024-07-20 16:16:34.591225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.035 [2024-07-20 16:16:34.591253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.035 [2024-07-20 16:16:34.591378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00400000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.035 [2024-07-20 16:16:34.591396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.035 #50 NEW cov: 11748 ft: 14820 corp: 27/687b lim: 40 exec/s: 50 rss: 68Mb L: 19/34 MS: 1 CrossOver- 00:08:06.035 [2024-07-20 16:16:34.641604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:002b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.035 [2024-07-20 16:16:34.641634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.035 [2024-07-20 16:16:34.641765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:10000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.035 [2024-07-20 16:16:34.641784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.035 [2024-07-20 16:16:34.641907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0000007b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.035 [2024-07-20 16:16:34.641925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.035 #51 NEW cov: 11748 ft: 14860 corp: 28/718b lim: 40 exec/s: 51 rss: 68Mb L: 31/34 MS: 1 ChangeBit- 00:08:06.035 [2024-07-20 16:16:34.691170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:feffffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.035 [2024-07-20 16:16:34.691198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.035 #53 NEW cov: 11748 ft: 14939 corp: 29/727b lim: 40 exec/s: 53 rss: 68Mb L: 9/34 MS: 2 ChangeBit-PersAutoDict- DE: "\376\377\377\377\000\000\000\000"- 00:08:06.035 [2024-07-20 16:16:34.731563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000100 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.035 [2024-07-20 16:16:34.731590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.035 [2024-07-20 16:16:34.731726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.035 [2024-07-20 16:16:34.731745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.035 #54 NEW cov: 11748 ft: 14946 corp: 30/747b lim: 40 exec/s: 54 rss: 68Mb L: 20/34 MS: 1 EraseBytes- 00:08:06.035 [2024-07-20 16:16:34.782077] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:000000fe cdw11:ffffff00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.035 [2024-07-20 16:16:34.782107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.035 [2024-07-20 16:16:34.782236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.035 [2024-07-20 16:16:34.782251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.035 [2024-07-20 16:16:34.782368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.035 [2024-07-20 16:16:34.782387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.035 [2024-07-20 16:16:34.782508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ff004000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.035 [2024-07-20 16:16:34.782524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.035 #55 NEW cov: 11748 ft: 14986 corp: 31/783b lim: 40 exec/s: 55 rss: 68Mb L: 36/36 MS: 1 InsertRepeatedBytes- 00:08:06.035 [2024-07-20 16:16:34.831691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a3d0000 cdw11:00002b00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.035 [2024-07-20 16:16:34.831720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.036 [2024-07-20 16:16:34.831838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.036 [2024-07-20 16:16:34.831858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.036 [2024-07-20 16:16:34.831980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:7b000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.036 [2024-07-20 16:16:34.831997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.295 #56 NEW cov: 11748 ft: 14989 corp: 32/811b lim: 40 exec/s: 56 rss: 68Mb L: 28/36 MS: 1 InsertByte- 00:08:06.295 [2024-07-20 16:16:34.892641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:002b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.295 [2024-07-20 16:16:34.892670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.295 [2024-07-20 16:16:34.892790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.295 [2024-07-20 16:16:34.892807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.295 [2024-07-20 
16:16:34.892930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000100 cdw11:29000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.295 [2024-07-20 16:16:34.892948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.295 [2024-07-20 16:16:34.893080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:000000af SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.295 [2024-07-20 16:16:34.893100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.295 #57 NEW cov: 11748 ft: 15080 corp: 33/845b lim: 40 exec/s: 57 rss: 68Mb L: 34/36 MS: 1 InsertByte- 00:08:06.295 [2024-07-20 16:16:34.942226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000100 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.295 [2024-07-20 16:16:34.942255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.295 [2024-07-20 16:16:34.942402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000040 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.295 [2024-07-20 16:16:34.942419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.295 #58 NEW cov: 11748 ft: 15147 corp: 34/866b lim: 40 exec/s: 58 rss: 68Mb L: 21/36 MS: 1 ChangeByte- 00:08:06.295 [2024-07-20 16:16:34.992636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:002b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.295 [2024-07-20 16:16:34.992665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.295 [2024-07-20 16:16:34.992803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:10000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.295 [2024-07-20 16:16:34.992819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.295 [2024-07-20 16:16:34.992942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:7b00007b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.295 [2024-07-20 16:16:34.992959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.295 #59 NEW cov: 11748 ft: 15163 corp: 35/897b lim: 40 exec/s: 59 rss: 69Mb L: 31/36 MS: 1 CopyPart- 00:08:06.295 [2024-07-20 16:16:35.043039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:002b7e00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.295 [2024-07-20 16:16:35.043070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.295 [2024-07-20 16:16:35.043190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.295 [2024-07-20 16:16:35.043206] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.295 [2024-07-20 16:16:35.043332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:7b000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.295 [2024-07-20 16:16:35.043350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.295 [2024-07-20 16:16:35.043489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000a42 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.295 [2024-07-20 16:16:35.043505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.295 #60 NEW cov: 11748 ft: 15184 corp: 36/929b lim: 40 exec/s: 60 rss: 69Mb L: 32/36 MS: 1 InsertByte- 00:08:06.295 [2024-07-20 16:16:35.082426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:002b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.295 [2024-07-20 16:16:35.082458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.295 [2024-07-20 16:16:35.082586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.295 [2024-07-20 16:16:35.082602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.295 [2024-07-20 16:16:35.082732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0000d800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.295 [2024-07-20 16:16:35.082747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.554 #61 NEW cov: 11748 ft: 15215 corp: 37/960b lim: 40 exec/s: 61 rss: 69Mb L: 31/36 MS: 1 ChangeBinInt- 00:08:06.555 [2024-07-20 16:16:35.122764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:2b000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.555 [2024-07-20 16:16:35.122792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.555 [2024-07-20 16:16:35.122921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.555 [2024-07-20 16:16:35.122938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.555 [2024-07-20 16:16:35.123076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00290000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.555 [2024-07-20 16:16:35.123092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.555 [2024-07-20 16:16:35.123219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00007b32 cdw11:0a000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.555 [2024-07-20 
16:16:35.123237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.555 #62 NEW cov: 11748 ft: 15218 corp: 38/995b lim: 40 exec/s: 62 rss: 69Mb L: 35/36 MS: 1 InsertByte- 00:08:06.555 [2024-07-20 16:16:35.163329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:2b000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.555 [2024-07-20 16:16:35.163361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.555 [2024-07-20 16:16:35.163487] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.555 [2024-07-20 16:16:35.163504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.555 [2024-07-20 16:16:35.163628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00290000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.555 [2024-07-20 16:16:35.163644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.555 [2024-07-20 16:16:35.163762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:002b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.555 [2024-07-20 16:16:35.163779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.555 #63 NEW cov: 11748 ft: 15231 corp: 39/1030b lim: 40 exec/s: 63 rss: 69Mb L: 35/36 MS: 1 CrossOver- 00:08:06.555 [2024-07-20 16:16:35.202854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a3d0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.555 [2024-07-20 16:16:35.202881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.555 [2024-07-20 16:16:35.203017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:007b0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.555 [2024-07-20 16:16:35.203035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.555 #64 NEW cov: 11748 ft: 15237 corp: 40/1051b lim: 40 exec/s: 64 rss: 69Mb L: 21/36 MS: 1 EraseBytes- 00:08:06.555 [2024-07-20 16:16:35.243270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:002b0e00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.555 [2024-07-20 16:16:35.243298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.555 [2024-07-20 16:16:35.243426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.555 [2024-07-20 16:16:35.243454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.555 [2024-07-20 16:16:35.243582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0000007b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.555 [2024-07-20 16:16:35.243597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.555 #65 NEW cov: 11748 ft: 15248 corp: 41/1082b lim: 40 exec/s: 32 rss: 69Mb L: 31/36 MS: 1 CMP- DE: "\016\000\000\000\000\000\000\000"- 00:08:06.555 #65 DONE cov: 11748 ft: 15248 corp: 41/1082b lim: 40 exec/s: 32 rss: 69Mb 00:08:06.555 ###### Recommended dictionary. ###### 00:08:06.555 "\376\377\377\377\000\000\000\000" # Uses: 3 00:08:06.555 "\001\000" # Uses: 1 00:08:06.555 "\016\000\000\000\000\000\000\000" # Uses: 0 00:08:06.555 ###### End of recommended dictionary. ###### 00:08:06.555 Done 65 runs in 2 second(s) 00:08:06.814 16:16:35 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_12.conf 00:08:06.814 16:16:35 -- ../common.sh@72 -- # (( i++ )) 00:08:06.814 16:16:35 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:06.814 16:16:35 -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:08:06.814 16:16:35 -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:08:06.814 16:16:35 -- nvmf/run.sh@24 -- # local timen=1 00:08:06.814 16:16:35 -- nvmf/run.sh@25 -- # local core=0x1 00:08:06.814 16:16:35 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:06.814 16:16:35 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:08:06.814 16:16:35 -- nvmf/run.sh@29 -- # printf %02d 13 00:08:06.814 16:16:35 -- nvmf/run.sh@29 -- # port=4413 00:08:06.814 16:16:35 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:06.814 16:16:35 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:08:06.814 16:16:35 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:06.814 16:16:35 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 -r /var/tmp/spdk13.sock 00:08:06.814 [2024-07-20 16:16:35.421068] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
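The shell trace above shows how run.sh stamps out per-target artifacts before launching llvm_nvme_fuzz: fuzzer index 13 is zero-padded and appended to give TCP port 4413, a per-target corpus directory llvm_nvmf_13 is created, and the stock fuzz_json.conf listener port 4420 is rewritten to the per-target port with sed. A minimal bash sketch of those steps, with $ROOT standing in for the SPDK checkout (a placeholder, not a variable taken from run.sh) and the 44 port prefix inferred from the 4413/4414 values in the log rather than read from the script:

  i=13
  port=44$(printf %02d "$i")                      # 4413 for fuzzer 13
  corpus_dir="$ROOT/../corpus/llvm_nvmf_$i"       # per-target corpus
  nvmf_cfg="/tmp/fuzz_json_$i.conf"
  mkdir -p "$corpus_dir"
  trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
  # Point the JSON target config at the per-fuzzer listener port.
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
      "$ROOT/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"

The fuzzer is then started with -F "$trid" -c "$nvmf_cfg" -D "$corpus_dir" -Z "$i", matching the invocation logged above.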
00:08:06.814 [2024-07-20 16:16:35.421151] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2275230 ] 00:08:06.814 EAL: No free 2048 kB hugepages reported on node 1 00:08:06.814 [2024-07-20 16:16:35.598120] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:06.814 [2024-07-20 16:16:35.617851] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:06.814 [2024-07-20 16:16:35.617981] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:07.073 [2024-07-20 16:16:35.669402] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:07.073 [2024-07-20 16:16:35.685695] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:08:07.073 INFO: Running with entropic power schedule (0xFF, 100). 00:08:07.073 INFO: Seed: 2533026445 00:08:07.073 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:08:07.073 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:08:07.073 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:07.073 INFO: A corpus is not provided, starting from an empty corpus 00:08:07.073 #2 INITED exec/s: 0 rss: 60Mb 00:08:07.073 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:07.073 This may also happen if the target rejected all inputs we tried so far 00:08:07.073 [2024-07-20 16:16:35.734301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:aeaeaeae cdw11:aeaeaeae SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.073 [2024-07-20 16:16:35.734335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.332 NEW_FUNC[1/670]: 0x4a3cc0 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:08:07.332 NEW_FUNC[2/670]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:07.332 #3 NEW cov: 11509 ft: 11505 corp: 2/10b lim: 40 exec/s: 0 rss: 66Mb L: 9/9 MS: 1 InsertRepeatedBytes- 00:08:07.332 [2024-07-20 16:16:36.086513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:aeaeaeae cdw11:aeaeaea6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.332 [2024-07-20 16:16:36.086560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.332 #4 NEW cov: 11622 ft: 12057 corp: 3/19b lim: 40 exec/s: 0 rss: 66Mb L: 9/9 MS: 1 ChangeBit- 00:08:07.591 [2024-07-20 16:16:36.136522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.591 [2024-07-20 16:16:36.136555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.591 #5 NEW cov: 11628 ft: 12415 corp: 4/31b lim: 40 exec/s: 0 rss: 66Mb L: 12/12 MS: 1 InsertRepeatedBytes- 00:08:07.591 [2024-07-20 16:16:36.176623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:aeaeaeae cdw11:aeaeaeae SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.591 [2024-07-20 16:16:36.176652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.591 #6 NEW cov: 11713 ft: 12644 corp: 5/44b lim: 40 exec/s: 0 rss: 66Mb L: 13/13 MS: 1 CopyPart- 00:08:07.591 [2024-07-20 16:16:36.216797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:aeaeae23 cdw11:aeaeaeae SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.591 [2024-07-20 16:16:36.216827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.591 #7 NEW cov: 11713 ft: 12727 corp: 6/54b lim: 40 exec/s: 0 rss: 66Mb L: 10/13 MS: 1 InsertByte- 00:08:07.591 [2024-07-20 16:16:36.256816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffff0a cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.591 [2024-07-20 16:16:36.256846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.591 #8 NEW cov: 11713 ft: 12794 corp: 7/67b lim: 40 exec/s: 0 rss: 66Mb L: 13/13 MS: 1 CrossOver- 00:08:07.591 [2024-07-20 16:16:36.296910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:aeaeae23 cdw11:aeaeae0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.591 [2024-07-20 16:16:36.296938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.591 #9 NEW cov: 11713 ft: 12963 corp: 8/77b lim: 40 exec/s: 0 rss: 67Mb L: 10/13 MS: 1 ChangeBinInt- 00:08:07.591 [2024-07-20 16:16:36.337107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffff0a cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.591 [2024-07-20 16:16:36.337135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.591 #10 NEW cov: 11713 ft: 13022 corp: 9/90b lim: 40 exec/s: 0 rss: 67Mb L: 13/13 MS: 1 ShuffleBytes- 00:08:07.591 [2024-07-20 16:16:36.377017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:aeaeae3f cdw11:23aeaeae SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.591 [2024-07-20 16:16:36.377046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.850 #11 NEW cov: 11713 ft: 13077 corp: 10/101b lim: 40 exec/s: 0 rss: 67Mb L: 11/13 MS: 1 InsertByte- 00:08:07.850 [2024-07-20 16:16:36.417534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:77777777 cdw11:77777777 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.850 [2024-07-20 16:16:36.417562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.850 [2024-07-20 16:16:36.417698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:77777777 cdw11:77777777 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.850 [2024-07-20 16:16:36.417716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 
p:0 m:0 dnr:0 00:08:07.850 #13 NEW cov: 11713 ft: 13469 corp: 11/122b lim: 40 exec/s: 0 rss: 67Mb L: 21/21 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:07.850 [2024-07-20 16:16:36.457665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:61616161 cdw11:61616161 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.850 [2024-07-20 16:16:36.457695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.850 [2024-07-20 16:16:36.457825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:616161ae cdw11:8eaea60a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.850 [2024-07-20 16:16:36.457842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.850 #16 NEW cov: 11713 ft: 13539 corp: 12/138b lim: 40 exec/s: 0 rss: 67Mb L: 16/21 MS: 3 EraseBytes-ChangeByte-InsertRepeatedBytes- 00:08:07.850 [2024-07-20 16:16:36.497870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffff0a cdw11:86868686 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.850 [2024-07-20 16:16:36.497898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.850 [2024-07-20 16:16:36.498030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:8686ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.850 [2024-07-20 16:16:36.498050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.850 #17 NEW cov: 11713 ft: 13577 corp: 13/161b lim: 40 exec/s: 0 rss: 67Mb L: 23/23 MS: 1 InsertRepeatedBytes- 00:08:07.850 [2024-07-20 16:16:36.537813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffff0a cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.850 [2024-07-20 16:16:36.537840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.850 #18 NEW cov: 11713 ft: 13593 corp: 14/174b lim: 40 exec/s: 0 rss: 67Mb L: 13/23 MS: 1 ChangeBinInt- 00:08:07.850 [2024-07-20 16:16:36.577811] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:aeaeae23 cdw11:aeaeaeae SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.850 [2024-07-20 16:16:36.577837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.850 #19 NEW cov: 11713 ft: 13641 corp: 15/186b lim: 40 exec/s: 0 rss: 67Mb L: 12/23 MS: 1 CopyPart- 00:08:07.850 [2024-07-20 16:16:36.608118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:aeaeaeae cdw11:aeaeaeae SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.850 [2024-07-20 16:16:36.608147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.850 [2024-07-20 16:16:36.608265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff03 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.850 [2024-07-20 16:16:36.608293] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.850 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:07.850 #20 NEW cov: 11736 ft: 13695 corp: 16/203b lim: 40 exec/s: 0 rss: 67Mb L: 17/23 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\003"- 00:08:07.850 [2024-07-20 16:16:36.648059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:aeaeae3f cdw11:ae23aeae SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.850 [2024-07-20 16:16:36.648086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.110 #21 NEW cov: 11736 ft: 13735 corp: 17/214b lim: 40 exec/s: 0 rss: 67Mb L: 11/23 MS: 1 ShuffleBytes- 00:08:08.110 [2024-07-20 16:16:36.688212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:aeae2623 cdw11:aeaeae0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.110 [2024-07-20 16:16:36.688239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.110 #22 NEW cov: 11736 ft: 13794 corp: 18/224b lim: 40 exec/s: 0 rss: 67Mb L: 10/23 MS: 1 ChangeByte- 00:08:08.110 [2024-07-20 16:16:36.728395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:fffffd0a cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.110 [2024-07-20 16:16:36.728427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.110 #23 NEW cov: 11736 ft: 13848 corp: 19/237b lim: 40 exec/s: 23 rss: 67Mb L: 13/23 MS: 1 ChangeBit- 00:08:08.110 [2024-07-20 16:16:36.768506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff41ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.110 [2024-07-20 16:16:36.768533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.110 #24 NEW cov: 11736 ft: 13861 corp: 20/249b lim: 40 exec/s: 24 rss: 67Mb L: 12/23 MS: 1 ChangeByte- 00:08:08.110 [2024-07-20 16:16:36.808548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.110 [2024-07-20 16:16:36.808576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.110 #30 NEW cov: 11736 ft: 13936 corp: 21/261b lim: 40 exec/s: 30 rss: 67Mb L: 12/23 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\003"- 00:08:08.110 [2024-07-20 16:16:36.848759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:aeaeae3f cdw11:ae23aeae SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.110 [2024-07-20 16:16:36.848787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.110 #31 NEW cov: 11736 ft: 13999 corp: 22/272b lim: 40 exec/s: 31 rss: 68Mb L: 11/23 MS: 1 ChangeByte- 00:08:08.110 [2024-07-20 16:16:36.888891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:61616161 cdw11:61616161 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:08:08.110 [2024-07-20 16:16:36.888919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.369 #32 NEW cov: 11736 ft: 14025 corp: 23/285b lim: 40 exec/s: 32 rss: 68Mb L: 13/23 MS: 1 EraseBytes- 00:08:08.369 [2024-07-20 16:16:36.928990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:aeaeae3f cdw11:ae23aeae SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.369 [2024-07-20 16:16:36.929018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.369 #33 NEW cov: 11736 ft: 14034 corp: 24/296b lim: 40 exec/s: 33 rss: 68Mb L: 11/23 MS: 1 CopyPart- 00:08:08.369 [2024-07-20 16:16:36.969050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:aeaeaeae cdw11:aeaeaeae SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.369 [2024-07-20 16:16:36.969078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.369 #34 NEW cov: 11736 ft: 14103 corp: 25/309b lim: 40 exec/s: 34 rss: 68Mb L: 13/23 MS: 1 ChangeBinInt- 00:08:08.369 [2024-07-20 16:16:37.009375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffff0a cdw11:86868686 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.369 [2024-07-20 16:16:37.009402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.369 [2024-07-20 16:16:37.009537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:8686ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.369 [2024-07-20 16:16:37.009555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.369 #35 NEW cov: 11736 ft: 14104 corp: 26/332b lim: 40 exec/s: 35 rss: 68Mb L: 23/23 MS: 1 ChangeByte- 00:08:08.369 [2024-07-20 16:16:37.049617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:aeaeaeae cdw11:aeaeaeae SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.369 [2024-07-20 16:16:37.049650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.369 [2024-07-20 16:16:37.049775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.370 [2024-07-20 16:16:37.049791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.370 #36 NEW cov: 11736 ft: 14145 corp: 27/349b lim: 40 exec/s: 36 rss: 68Mb L: 17/23 MS: 1 CrossOver- 00:08:08.370 [2024-07-20 16:16:37.089668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:61010000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.370 [2024-07-20 16:16:37.089695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.370 [2024-07-20 16:16:37.089820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:006161ae cdw11:8eaea60a SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.370 [2024-07-20 16:16:37.089837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.370 #37 NEW cov: 11736 ft: 14159 corp: 28/365b lim: 40 exec/s: 37 rss: 68Mb L: 16/23 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:08:08.370 [2024-07-20 16:16:37.129595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:6161aeae cdw11:23aeaeae SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.370 [2024-07-20 16:16:37.129623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.370 #38 NEW cov: 11736 ft: 14183 corp: 29/378b lim: 40 exec/s: 38 rss: 68Mb L: 13/23 MS: 1 CrossOver- 00:08:08.370 [2024-07-20 16:16:37.170035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff41ff cdw11:ffffff00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.370 [2024-07-20 16:16:37.170063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.370 [2024-07-20 16:16:37.170181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.370 [2024-07-20 16:16:37.170197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.629 #39 NEW cov: 11736 ft: 14194 corp: 30/397b lim: 40 exec/s: 39 rss: 68Mb L: 19/23 MS: 1 InsertRepeatedBytes- 00:08:08.629 [2024-07-20 16:16:37.209839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:aeaeae01 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.629 [2024-07-20 16:16:37.209866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.630 #40 NEW cov: 11736 ft: 14210 corp: 31/408b lim: 40 exec/s: 40 rss: 68Mb L: 11/23 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:08:08.630 [2024-07-20 16:16:37.249927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:6161aeae cdw11:23aaaeae SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.630 [2024-07-20 16:16:37.249956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.630 #41 NEW cov: 11736 ft: 14230 corp: 32/421b lim: 40 exec/s: 41 rss: 68Mb L: 13/23 MS: 1 ChangeBit- 00:08:08.630 [2024-07-20 16:16:37.290073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:aeaeae23 cdw11:aeaeaeae SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.630 [2024-07-20 16:16:37.290101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.630 #42 NEW cov: 11736 ft: 14267 corp: 33/433b lim: 40 exec/s: 42 rss: 68Mb L: 12/23 MS: 1 ChangeBinInt- 00:08:08.630 [2024-07-20 16:16:37.330445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:61616161 cdw11:61616161 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.630 [2024-07-20 16:16:37.330473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.630 [2024-07-20 16:16:37.330584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:616161ae cdw11:8eaea60a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.630 [2024-07-20 16:16:37.330600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.630 #43 NEW cov: 11736 ft: 14272 corp: 34/449b lim: 40 exec/s: 43 rss: 69Mb L: 16/23 MS: 1 CopyPart- 00:08:08.630 [2024-07-20 16:16:37.370518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:616161a6 cdw11:9e9e9e9e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.630 [2024-07-20 16:16:37.370546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.630 [2024-07-20 16:16:37.370672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:9e9e9eae cdw11:8eaea60a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.630 [2024-07-20 16:16:37.370688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.630 #44 NEW cov: 11736 ft: 14281 corp: 35/465b lim: 40 exec/s: 44 rss: 69Mb L: 16/23 MS: 1 ChangeBinInt- 00:08:08.630 [2024-07-20 16:16:37.410686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:aeaeaeae cdw11:aeaeaeae SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.630 [2024-07-20 16:16:37.410713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.630 [2024-07-20 16:16:37.410853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffaeaeff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.630 [2024-07-20 16:16:37.410871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.889 [2024-07-20 16:16:37.450805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:aeaeaeae cdw11:aeaeaeae SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.889 [2024-07-20 16:16:37.450834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.889 [2024-07-20 16:16:37.450970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffaeaeff cdw11:ff7fffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.889 [2024-07-20 16:16:37.450990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.889 #46 NEW cov: 11736 ft: 14283 corp: 36/482b lim: 40 exec/s: 46 rss: 69Mb L: 17/23 MS: 2 CrossOver-ChangeBit- 00:08:08.889 [2024-07-20 16:16:37.490725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:aeaeaeae cdw11:aeaeaea6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.889 [2024-07-20 16:16:37.490755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.889 #47 NEW cov: 11736 ft: 14331 corp: 37/491b lim: 40 exec/s: 47 rss: 69Mb L: 9/23 MS: 1 CopyPart- 00:08:08.889 [2024-07-20 16:16:37.530871] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:aeae23ae cdw11:aeaeae0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.889 [2024-07-20 16:16:37.530899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.889 #48 NEW cov: 11736 ft: 14333 corp: 38/501b lim: 40 exec/s: 48 rss: 69Mb L: 10/23 MS: 1 ShuffleBytes- 00:08:08.889 [2024-07-20 16:16:37.570852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:aeffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.889 [2024-07-20 16:16:37.570879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.889 [2024-07-20 16:16:37.570996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:03aeaeae cdw11:aeaeaeae SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.889 [2024-07-20 16:16:37.571012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.889 #49 NEW cov: 11736 ft: 14358 corp: 39/522b lim: 40 exec/s: 49 rss: 69Mb L: 21/23 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\003"- 00:08:08.889 [2024-07-20 16:16:37.621310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:77777777 cdw11:77777777 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.889 [2024-07-20 16:16:37.621339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.889 [2024-07-20 16:16:37.621460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:77777777 cdw11:77777777 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.889 [2024-07-20 16:16:37.621478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.889 #50 NEW cov: 11736 ft: 14361 corp: 40/543b lim: 40 exec/s: 50 rss: 69Mb L: 21/23 MS: 1 ChangeBit- 00:08:08.889 [2024-07-20 16:16:37.681289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:aeaeaeae cdw11:aeaeffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.889 [2024-07-20 16:16:37.681320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.149 #51 NEW cov: 11736 ft: 14430 corp: 41/553b lim: 40 exec/s: 51 rss: 69Mb L: 10/23 MS: 1 EraseBytes- 00:08:09.149 [2024-07-20 16:16:37.721232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:aeaeaeff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.149 [2024-07-20 16:16:37.721261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.149 [2024-07-20 16:16:37.721402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffaeae SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.149 [2024-07-20 16:16:37.721420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.149 #52 NEW cov: 11736 ft: 14441 corp: 42/574b lim: 40 exec/s: 26 rss: 69Mb L: 
21/23 MS: 1 CrossOver- 00:08:09.149 #52 DONE cov: 11736 ft: 14441 corp: 42/574b lim: 40 exec/s: 26 rss: 69Mb 00:08:09.149 ###### Recommended dictionary. ###### 00:08:09.149 "\377\377\377\377\377\377\377\003" # Uses: 2 00:08:09.149 "\001\000\000\000\000\000\000\000" # Uses: 1 00:08:09.149 ###### End of recommended dictionary. ###### 00:08:09.149 Done 52 runs in 2 second(s) 00:08:09.149 16:16:37 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_13.conf 00:08:09.149 16:16:37 -- ../common.sh@72 -- # (( i++ )) 00:08:09.149 16:16:37 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:09.149 16:16:37 -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:08:09.149 16:16:37 -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:08:09.149 16:16:37 -- nvmf/run.sh@24 -- # local timen=1 00:08:09.149 16:16:37 -- nvmf/run.sh@25 -- # local core=0x1 00:08:09.149 16:16:37 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:09.149 16:16:37 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:08:09.149 16:16:37 -- nvmf/run.sh@29 -- # printf %02d 14 00:08:09.149 16:16:37 -- nvmf/run.sh@29 -- # port=4414 00:08:09.149 16:16:37 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:09.149 16:16:37 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:08:09.149 16:16:37 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:09.149 16:16:37 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 -r /var/tmp/spdk14.sock 00:08:09.149 [2024-07-20 16:16:37.896098] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:08:09.149 [2024-07-20 16:16:37.896168] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2275712 ] 00:08:09.149 EAL: No free 2048 kB hugepages reported on node 1 00:08:09.409 [2024-07-20 16:16:38.074139] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:09.409 [2024-07-20 16:16:38.093739] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:09.409 [2024-07-20 16:16:38.093864] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:09.409 [2024-07-20 16:16:38.145373] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:09.409 [2024-07-20 16:16:38.161689] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:08:09.409 INFO: Running with entropic power schedule (0xFF, 100). 
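The shell trace above shows how each fuzzer run is wired up before libFuzzer takes over: the previous run's JSON config is removed, the loop counter advances, and run 14 gets its own TCP port (4414, derived from the fuzzer number via printf %02d), corpus directory, transport ID, and RPC socket. A minimal sketch of that per-run setup pattern, assuming placeholder paths rather than the Jenkins workspace layout:

    # Sketch only -- mirrors the per-run setup visible in the trace above, not the
    # actual nvmf/run.sh. Paths and the fuzzer binary location are placeholders.
    fuzzer_type=14
    port=44$(printf %02d "$fuzzer_type")        # fuzzer 14 -> port 4414, fuzzer 15 -> 4415
    nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf
    corpus_dir=/tmp/llvm_nvmf_${fuzzer_type}    # hypothetical corpus location

    mkdir -p "$corpus_dir"
    # Give this run's NVMe-oF target its own listener port by rewriting the template.
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" fuzz_json.conf > "$nvmf_cfg"

    # Flags as they appear in the logged invocation: -Z selects the admin-command
    # fuzzer, -D the corpus directory, -r the per-run RPC socket.
    ./llvm_nvme_fuzz -m 0x1 -s 512 \
      -F "trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port" \
      -c "$nvmf_cfg" -t 1 -D "$corpus_dir" -Z "$fuzzer_type" -r "/var/tmp/spdk${fuzzer_type}.sock"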
00:08:09.409 INFO: Seed: 714059030 00:08:09.409 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:08:09.409 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:08:09.409 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:09.409 INFO: A corpus is not provided, starting from an empty corpus 00:08:09.409 #2 INITED exec/s: 0 rss: 60Mb 00:08:09.409 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:09.409 This may also happen if the target rejected all inputs we tried so far 00:08:09.944 NEW_FUNC[1/658]: 0x4a5880 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:08:09.944 NEW_FUNC[2/658]: 0x4c6c20 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:08:09.944 #18 NEW cov: 11401 ft: 11402 corp: 2/10b lim: 35 exec/s: 0 rss: 66Mb L: 9/9 MS: 1 CMP- DE: "\000\000\000\000\000\000\000p"- 00:08:09.944 [2024-07-20 16:16:38.521907] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.944 [2024-07-20 16:16:38.521944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.944 NEW_FUNC[1/15]: 0x16c61d0 in spdk_nvme_print_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:263 00:08:09.944 NEW_FUNC[2/15]: 0x16c6410 in nvme_admin_qpair_print_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:202 00:08:09.944 #23 NEW cov: 11649 ft: 12551 corp: 3/24b lim: 35 exec/s: 0 rss: 66Mb L: 14/14 MS: 5 EraseBytes-ChangeBit-ChangeBinInt-ChangeBinInt-CrossOver- 00:08:09.944 [2024-07-20 16:16:38.571936] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.944 [2024-07-20 16:16:38.571964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.944 #24 NEW cov: 11655 ft: 12747 corp: 4/38b lim: 35 exec/s: 0 rss: 66Mb L: 14/14 MS: 1 CrossOver- 00:08:09.944 [2024-07-20 16:16:38.612014] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.944 [2024-07-20 16:16:38.612041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.944 #25 NEW cov: 11740 ft: 13107 corp: 5/55b lim: 35 exec/s: 0 rss: 66Mb L: 17/17 MS: 1 CopyPart- 00:08:09.944 [2024-07-20 16:16:38.652279] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.944 [2024-07-20 16:16:38.652308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.944 [2024-07-20 16:16:38.652366] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.944 [2024-07-20 16:16:38.652381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.944 [2024-07-20 
16:16:38.652439] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.944 [2024-07-20 16:16:38.652457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.944 #26 NEW cov: 11747 ft: 13477 corp: 6/81b lim: 35 exec/s: 0 rss: 67Mb L: 26/26 MS: 1 InsertRepeatedBytes- 00:08:09.944 #27 NEW cov: 11747 ft: 13564 corp: 7/89b lim: 35 exec/s: 0 rss: 67Mb L: 8/26 MS: 1 EraseBytes- 00:08:09.944 [2024-07-20 16:16:38.732362] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.944 [2024-07-20 16:16:38.732387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.202 #28 NEW cov: 11747 ft: 13668 corp: 8/103b lim: 35 exec/s: 0 rss: 67Mb L: 14/26 MS: 1 ChangeByte- 00:08:10.202 [2024-07-20 16:16:38.772629] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.202 [2024-07-20 16:16:38.772656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.202 [2024-07-20 16:16:38.772716] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.202 [2024-07-20 16:16:38.772731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.202 [2024-07-20 16:16:38.772800] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.202 [2024-07-20 16:16:38.772814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.202 #29 NEW cov: 11747 ft: 13722 corp: 9/129b lim: 35 exec/s: 0 rss: 67Mb L: 26/26 MS: 1 ChangeBit- 00:08:10.202 [2024-07-20 16:16:38.812756] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.202 [2024-07-20 16:16:38.812784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.202 [2024-07-20 16:16:38.812842] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.202 [2024-07-20 16:16:38.812857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.202 [2024-07-20 16:16:38.812915] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.202 [2024-07-20 16:16:38.812928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.202 #30 NEW cov: 11747 ft: 13758 corp: 10/155b lim: 35 exec/s: 0 rss: 67Mb L: 26/26 MS: 1 ChangeByte- 00:08:10.202 #31 NEW cov: 11747 ft: 13810 corp: 11/164b lim: 35 exec/s: 0 rss: 67Mb L: 9/26 MS: 1 ChangeBit- 00:08:10.202 [2024-07-20 16:16:38.882949] nvme_qpair.c: 215:nvme_admin_qpair_print_command: 
*NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.203 [2024-07-20 16:16:38.882979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.203 [2024-07-20 16:16:38.883064] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.203 [2024-07-20 16:16:38.883081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.203 [2024-07-20 16:16:38.883137] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.203 [2024-07-20 16:16:38.883150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.203 #32 NEW cov: 11747 ft: 13825 corp: 12/190b lim: 35 exec/s: 0 rss: 67Mb L: 26/26 MS: 1 CopyPart- 00:08:10.203 [2024-07-20 16:16:38.923070] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.203 [2024-07-20 16:16:38.923099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.203 [2024-07-20 16:16:38.923178] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.203 [2024-07-20 16:16:38.923195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.203 [2024-07-20 16:16:38.923252] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.203 [2024-07-20 16:16:38.923267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.203 #33 NEW cov: 11747 ft: 13848 corp: 13/216b lim: 35 exec/s: 0 rss: 67Mb L: 26/26 MS: 1 ShuffleBytes- 00:08:10.203 #34 NEW cov: 11747 ft: 13932 corp: 14/226b lim: 35 exec/s: 0 rss: 67Mb L: 10/26 MS: 1 EraseBytes- 00:08:10.203 [2024-07-20 16:16:39.003427] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.203 [2024-07-20 16:16:39.003460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.203 [2024-07-20 16:16:39.003519] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.203 [2024-07-20 16:16:39.003536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.203 [2024-07-20 16:16:39.003594] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.203 [2024-07-20 16:16:39.003608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.203 [2024-07-20 16:16:39.003667] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED 
cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.203 [2024-07-20 16:16:39.003681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.460 #35 NEW cov: 11747 ft: 14221 corp: 15/259b lim: 35 exec/s: 0 rss: 67Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:08:10.460 #36 NEW cov: 11747 ft: 14244 corp: 16/267b lim: 35 exec/s: 0 rss: 68Mb L: 8/33 MS: 1 ChangeBit- 00:08:10.460 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:10.460 #38 NEW cov: 11770 ft: 14318 corp: 17/274b lim: 35 exec/s: 0 rss: 68Mb L: 7/33 MS: 2 EraseBytes-InsertByte- 00:08:10.460 [2024-07-20 16:16:39.133658] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.460 [2024-07-20 16:16:39.133687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.460 [2024-07-20 16:16:39.133750] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.460 [2024-07-20 16:16:39.133767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.460 [2024-07-20 16:16:39.133840] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.460 [2024-07-20 16:16:39.133855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.460 #39 NEW cov: 11770 ft: 14356 corp: 18/298b lim: 35 exec/s: 0 rss: 68Mb L: 24/33 MS: 1 EraseBytes- 00:08:10.460 [2024-07-20 16:16:39.173947] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.460 [2024-07-20 16:16:39.173976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.460 [2024-07-20 16:16:39.174037] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.460 [2024-07-20 16:16:39.174053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.460 [2024-07-20 16:16:39.174114] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.460 [2024-07-20 16:16:39.174127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.460 [2024-07-20 16:16:39.174189] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.460 [2024-07-20 16:16:39.174202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.460 #40 NEW cov: 11770 ft: 14384 corp: 19/332b lim: 35 exec/s: 40 rss: 68Mb L: 34/34 MS: 1 InsertByte- 00:08:10.461 [2024-07-20 16:16:39.214112] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000006c SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:08:10.461 [2024-07-20 16:16:39.214139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.461 [2024-07-20 16:16:39.214196] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000006c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.461 [2024-07-20 16:16:39.214210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.461 [2024-07-20 16:16:39.214272] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.461 [2024-07-20 16:16:39.214286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.461 #41 NEW cov: 11770 ft: 14428 corp: 20/362b lim: 35 exec/s: 41 rss: 68Mb L: 30/34 MS: 1 InsertRepeatedBytes- 00:08:10.461 [2024-07-20 16:16:39.253888] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.461 [2024-07-20 16:16:39.253915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.717 #42 NEW cov: 11770 ft: 14453 corp: 21/380b lim: 35 exec/s: 42 rss: 68Mb L: 18/34 MS: 1 CMP- DE: "\000\000\000\327"- 00:08:10.717 #43 NEW cov: 11770 ft: 14490 corp: 22/387b lim: 35 exec/s: 43 rss: 68Mb L: 7/34 MS: 1 EraseBytes- 00:08:10.717 [2024-07-20 16:16:39.334288] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.717 [2024-07-20 16:16:39.334314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.717 [2024-07-20 16:16:39.334393] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.717 [2024-07-20 16:16:39.334407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.717 #44 NEW cov: 11770 ft: 14603 corp: 23/411b lim: 35 exec/s: 44 rss: 68Mb L: 24/34 MS: 1 CrossOver- 00:08:10.717 #45 NEW cov: 11770 ft: 14635 corp: 24/419b lim: 35 exec/s: 45 rss: 68Mb L: 8/34 MS: 1 PersAutoDict- DE: "\000\000\000\327"- 00:08:10.717 [2024-07-20 16:16:39.414715] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.717 [2024-07-20 16:16:39.414741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.717 [2024-07-20 16:16:39.414802] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.717 [2024-07-20 16:16:39.414816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.717 [2024-07-20 16:16:39.414875] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000070 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.717 [2024-07-20 16:16:39.414888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 
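The command/completion pairs in this run decode cleanly against the NVMe spec: for Set Features (admin opcode 0x09) the low byte of cdw10 is the Feature Identifier and bit 31 is the Save bit, so cdw10:8000000a asks to save Write Atomicity (FID 0x0A) and draws FEATURE ID NOT SAVEABLE (01/0d), while reserved FIDs such as 0x00 or 0x6c come back INVALID FIELD (00/02). One of these mutated commands can be replayed by hand with nvme-cli's admin-passthru; a sketch, with the device path purely illustrative:

    # Replay sketch against a scratch controller (the device path is an assumption).
    # In cdw10: bit 31 = Save, bits 7:0 = Feature Identifier.
    sudo nvme admin-passthru /dev/nvme0 --opcode=0x09 --cdw10=0x8000000a  # save Write Atomicity -> 01/0d if unsaveable
    sudo nvme admin-passthru /dev/nvme0 --opcode=0x09 --cdw10=0x00000000  # reserved FID 0x00 -> INVALID FIELD (00/02)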
00:08:10.717 #46 NEW cov: 11770 ft: 14658 corp: 25/453b lim: 35 exec/s: 46 rss: 68Mb L: 34/34 MS: 1 CopyPart- 00:08:10.717 [2024-07-20 16:16:39.454293] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.717 [2024-07-20 16:16:39.454320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.717 #47 NEW cov: 11770 ft: 14693 corp: 26/462b lim: 35 exec/s: 47 rss: 68Mb L: 9/34 MS: 1 InsertByte- 00:08:10.717 #48 NEW cov: 11770 ft: 14786 corp: 27/470b lim: 35 exec/s: 48 rss: 68Mb L: 8/34 MS: 1 ChangeByte- 00:08:10.974 #49 NEW cov: 11770 ft: 14871 corp: 28/489b lim: 35 exec/s: 49 rss: 68Mb L: 19/34 MS: 1 CrossOver- 00:08:10.974 [2024-07-20 16:16:39.575130] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.974 [2024-07-20 16:16:39.575157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.974 [2024-07-20 16:16:39.575234] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.975 [2024-07-20 16:16:39.575251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.975 [2024-07-20 16:16:39.575311] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.975 [2024-07-20 16:16:39.575324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.975 [2024-07-20 16:16:39.575385] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000070 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.975 [2024-07-20 16:16:39.575399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.975 #50 NEW cov: 11770 ft: 14896 corp: 29/519b lim: 35 exec/s: 50 rss: 68Mb L: 30/34 MS: 1 PersAutoDict- DE: "\000\000\000\327"- 00:08:10.975 #51 NEW cov: 11770 ft: 14901 corp: 30/530b lim: 35 exec/s: 51 rss: 68Mb L: 11/34 MS: 1 CopyPart- 00:08:10.975 [2024-07-20 16:16:39.644807] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.975 [2024-07-20 16:16:39.644835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.975 #52 NEW cov: 11770 ft: 14914 corp: 31/539b lim: 35 exec/s: 52 rss: 69Mb L: 9/34 MS: 1 CopyPart- 00:08:10.975 [2024-07-20 16:16:39.685291] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.975 [2024-07-20 16:16:39.685318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.975 [2024-07-20 16:16:39.685377] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.975 [2024-07-20 16:16:39.685394] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.975 [2024-07-20 16:16:39.685476] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.975 [2024-07-20 16:16:39.685490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.975 #53 NEW cov: 11770 ft: 14921 corp: 32/565b lim: 35 exec/s: 53 rss: 69Mb L: 26/34 MS: 1 ShuffleBytes- 00:08:10.975 #54 NEW cov: 11770 ft: 14922 corp: 33/575b lim: 35 exec/s: 54 rss: 69Mb L: 10/34 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000p"- 00:08:10.975 [2024-07-20 16:16:39.755456] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.975 [2024-07-20 16:16:39.755485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.975 [2024-07-20 16:16:39.755546] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.975 [2024-07-20 16:16:39.755562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.975 [2024-07-20 16:16:39.755619] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.975 [2024-07-20 16:16:39.755633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.975 #55 NEW cov: 11770 ft: 14962 corp: 34/602b lim: 35 exec/s: 55 rss: 69Mb L: 27/34 MS: 1 InsertByte- 00:08:11.232 #56 NEW cov: 11770 ft: 14974 corp: 35/613b lim: 35 exec/s: 56 rss: 69Mb L: 11/34 MS: 1 ChangeBit- 00:08:11.232 #57 NEW cov: 11770 ft: 14975 corp: 36/624b lim: 35 exec/s: 57 rss: 69Mb L: 11/34 MS: 1 InsertByte- 00:08:11.232 [2024-07-20 16:16:39.876038] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.232 [2024-07-20 16:16:39.876064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.233 [2024-07-20 16:16:39.876123] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.233 [2024-07-20 16:16:39.876139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.233 [2024-07-20 16:16:39.876197] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.233 [2024-07-20 16:16:39.876213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.233 #58 NEW cov: 11770 ft: 15026 corp: 37/655b lim: 35 exec/s: 58 rss: 69Mb L: 31/34 MS: 1 InsertRepeatedBytes- 00:08:11.233 [2024-07-20 16:16:39.916077] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.233 [2024-07-20 16:16:39.916105] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.233 [2024-07-20 16:16:39.916172] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.233 [2024-07-20 16:16:39.916188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.233 [2024-07-20 16:16:39.916243] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.233 [2024-07-20 16:16:39.916256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.233 [2024-07-20 16:16:39.916318] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.233 [2024-07-20 16:16:39.916331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.233 #59 NEW cov: 11770 ft: 15049 corp: 38/688b lim: 35 exec/s: 59 rss: 69Mb L: 33/34 MS: 1 CrossOver- 00:08:11.233 #60 NEW cov: 11770 ft: 15067 corp: 39/700b lim: 35 exec/s: 60 rss: 69Mb L: 12/34 MS: 1 InsertByte- 00:08:11.233 [2024-07-20 16:16:39.996015] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.233 [2024-07-20 16:16:39.996041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.490 #61 NEW cov: 11770 ft: 15104 corp: 40/714b lim: 35 exec/s: 61 rss: 69Mb L: 14/34 MS: 1 CrossOver- 00:08:11.490 #62 NEW cov: 11770 ft: 15302 corp: 41/721b lim: 35 exec/s: 62 rss: 69Mb L: 7/34 MS: 1 EraseBytes- 00:08:11.490 [2024-07-20 16:16:40.127410] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.490 [2024-07-20 16:16:40.127447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.490 [2024-07-20 16:16:40.127594] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.490 [2024-07-20 16:16:40.127614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.490 #63 NEW cov: 11770 ft: 15324 corp: 42/735b lim: 35 exec/s: 63 rss: 69Mb L: 14/34 MS: 1 CrossOver- 00:08:11.490 [2024-07-20 16:16:40.177240] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.490 [2024-07-20 16:16:40.177276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.490 #64 pulse cov: 11770 ft: 15366 corp: 42/735b lim: 35 exec/s: 32 rss: 69Mb 00:08:11.490 #64 NEW cov: 11770 ft: 15366 corp: 43/744b lim: 35 exec/s: 32 rss: 69Mb L: 9/34 MS: 1 ChangeByte- 00:08:11.490 #64 DONE cov: 11770 ft: 15366 corp: 43/744b lim: 35 exec/s: 32 rss: 69Mb 00:08:11.490 ###### Recommended dictionary. 
###### 00:08:11.490 "\000\000\000\000\000\000\000p" # Uses: 1 00:08:11.490 "\000\000\000\327" # Uses: 2 00:08:11.490 ###### End of recommended dictionary. ###### 00:08:11.490 Done 64 runs in 2 second(s) 00:08:11.747 16:16:40 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_14.conf 00:08:11.747 16:16:40 -- ../common.sh@72 -- # (( i++ )) 00:08:11.747 16:16:40 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:11.747 16:16:40 -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:08:11.747 16:16:40 -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:08:11.747 16:16:40 -- nvmf/run.sh@24 -- # local timen=1 00:08:11.747 16:16:40 -- nvmf/run.sh@25 -- # local core=0x1 00:08:11.747 16:16:40 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:11.747 16:16:40 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:08:11.747 16:16:40 -- nvmf/run.sh@29 -- # printf %02d 15 00:08:11.747 16:16:40 -- nvmf/run.sh@29 -- # port=4415 00:08:11.748 16:16:40 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:11.748 16:16:40 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:08:11.748 16:16:40 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:11.748 16:16:40 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 -r /var/tmp/spdk15.sock 00:08:11.748 [2024-07-20 16:16:40.360475] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:08:11.748 [2024-07-20 16:16:40.360567] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2276066 ] 00:08:11.748 EAL: No free 2048 kB hugepages reported on node 1 00:08:11.748 [2024-07-20 16:16:40.534747] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:12.005 [2024-07-20 16:16:40.554147] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:12.005 [2024-07-20 16:16:40.554276] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:12.005 [2024-07-20 16:16:40.605870] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:12.005 [2024-07-20 16:16:40.622182] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:08:12.005 INFO: Running with entropic power schedule (0xFF, 100). 00:08:12.005 INFO: Seed: 3175074827 00:08:12.005 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:08:12.005 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:08:12.005 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:12.005 INFO: A corpus is not provided, starting from an empty corpus 00:08:12.005 #2 INITED exec/s: 0 rss: 60Mb 00:08:12.005 WARNING: no interesting inputs were found so far. 
Is the code instrumented for coverage? 00:08:12.005 This may also happen if the target rejected all inputs we tried so far 00:08:12.005 [2024-07-20 16:16:40.688823] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.005 [2024-07-20 16:16:40.688863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.005 [2024-07-20 16:16:40.689010] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.005 [2024-07-20 16:16:40.689029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.263 NEW_FUNC[1/671]: 0x4a6dc0 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:08:12.263 NEW_FUNC[2/671]: 0x4c6c20 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:08:12.263 #3 NEW cov: 11505 ft: 11506 corp: 2/23b lim: 35 exec/s: 0 rss: 66Mb L: 22/22 MS: 1 InsertRepeatedBytes- 00:08:12.263 [2024-07-20 16:16:41.029670] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.263 [2024-07-20 16:16:41.029711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.263 [2024-07-20 16:16:41.029859] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.263 [2024-07-20 16:16:41.029878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.263 #4 NEW cov: 11618 ft: 12265 corp: 3/45b lim: 35 exec/s: 0 rss: 66Mb L: 22/22 MS: 1 CopyPart- 00:08:12.520 [2024-07-20 16:16:41.079559] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.520 [2024-07-20 16:16:41.079594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.521 [2024-07-20 16:16:41.079741] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.521 [2024-07-20 16:16:41.079759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.521 #5 NEW cov: 11624 ft: 12452 corp: 4/67b lim: 35 exec/s: 0 rss: 66Mb L: 22/22 MS: 1 ChangeBinInt- 00:08:12.521 [2024-07-20 16:16:41.119264] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.521 [2024-07-20 16:16:41.119296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.521 [2024-07-20 16:16:41.119432] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.521 [2024-07-20 16:16:41.119457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 
sqhd:0011 p:0 m:0 dnr:0 00:08:12.521 #11 NEW cov: 11709 ft: 12716 corp: 5/89b lim: 35 exec/s: 0 rss: 66Mb L: 22/22 MS: 1 ChangeByte- 00:08:12.521 [2024-07-20 16:16:41.159833] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.521 [2024-07-20 16:16:41.159863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.521 [2024-07-20 16:16:41.160007] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.521 [2024-07-20 16:16:41.160022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.521 #12 NEW cov: 11709 ft: 12920 corp: 6/110b lim: 35 exec/s: 0 rss: 66Mb L: 21/22 MS: 1 EraseBytes- 00:08:12.521 [2024-07-20 16:16:41.200065] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.521 [2024-07-20 16:16:41.200093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.521 [2024-07-20 16:16:41.200230] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.521 [2024-07-20 16:16:41.200248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.521 #13 NEW cov: 11709 ft: 12975 corp: 7/131b lim: 35 exec/s: 0 rss: 67Mb L: 21/22 MS: 1 EraseBytes- 00:08:12.521 [2024-07-20 16:16:41.240405] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.521 [2024-07-20 16:16:41.240433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.521 [2024-07-20 16:16:41.240563] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.521 [2024-07-20 16:16:41.240582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.521 [2024-07-20 16:16:41.240703] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.521 [2024-07-20 16:16:41.240721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.521 #19 NEW cov: 11709 ft: 13437 corp: 8/162b lim: 35 exec/s: 0 rss: 67Mb L: 31/31 MS: 1 CrossOver- 00:08:12.521 [2024-07-20 16:16:41.280299] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.521 [2024-07-20 16:16:41.280333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.521 [2024-07-20 16:16:41.280475] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.521 [2024-07-20 16:16:41.280508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 
cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.521 #20 NEW cov: 11709 ft: 13481 corp: 9/184b lim: 35 exec/s: 0 rss: 67Mb L: 22/31 MS: 1 ChangeByte- 00:08:12.521 [2024-07-20 16:16:41.320452] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.521 [2024-07-20 16:16:41.320493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.779 NEW_FUNC[1/2]: 0x4c5fa0 in feat_interrupt_vector_configuration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:332 00:08:12.779 NEW_FUNC[2/2]: 0x1155fc0 in nvmf_ctrlr_get_features_interrupt_vector_configuration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1592 00:08:12.779 #21 NEW cov: 11758 ft: 13729 corp: 10/206b lim: 35 exec/s: 0 rss: 67Mb L: 22/31 MS: 1 ChangeBinInt- 00:08:12.779 [2024-07-20 16:16:41.360179] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000724 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.779 [2024-07-20 16:16:41.360208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.779 [2024-07-20 16:16:41.360348] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007eb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.779 [2024-07-20 16:16:41.360367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.779 #24 NEW cov: 11758 ft: 13884 corp: 11/222b lim: 35 exec/s: 0 rss: 67Mb L: 16/31 MS: 3 ChangeBit-ChangeByte-InsertRepeatedBytes- 00:08:12.779 [2024-07-20 16:16:41.400710] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.779 [2024-07-20 16:16:41.400741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.779 [2024-07-20 16:16:41.400884] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.779 [2024-07-20 16:16:41.400902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.779 #26 NEW cov: 11758 ft: 13911 corp: 12/248b lim: 35 exec/s: 0 rss: 67Mb L: 26/31 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:12.779 [2024-07-20 16:16:41.441027] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.779 [2024-07-20 16:16:41.441058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.779 [2024-07-20 16:16:41.441196] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007fd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.779 [2024-07-20 16:16:41.441213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.779 [2024-07-20 16:16:41.441348] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.779 [2024-07-20 16:16:41.441367] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.779 #27 NEW cov: 11758 ft: 13979 corp: 13/282b lim: 35 exec/s: 0 rss: 67Mb L: 34/34 MS: 1 CopyPart- 00:08:12.779 [2024-07-20 16:16:41.490728] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.779 [2024-07-20 16:16:41.490760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.779 [2024-07-20 16:16:41.490883] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.779 [2024-07-20 16:16:41.490901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.779 #28 NEW cov: 11758 ft: 13997 corp: 14/304b lim: 35 exec/s: 0 rss: 67Mb L: 22/34 MS: 1 ShuffleBytes- 00:08:12.779 [2024-07-20 16:16:41.541302] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.779 [2024-07-20 16:16:41.541330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.779 [2024-07-20 16:16:41.541467] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.779 [2024-07-20 16:16:41.541482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.779 [2024-07-20 16:16:41.541611] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.779 [2024-07-20 16:16:41.541629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.779 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:12.779 #29 NEW cov: 11781 ft: 14060 corp: 15/332b lim: 35 exec/s: 0 rss: 67Mb L: 28/34 MS: 1 InsertRepeatedBytes- 00:08:12.779 [2024-07-20 16:16:41.581171] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.779 [2024-07-20 16:16:41.581198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.779 [2024-07-20 16:16:41.581336] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.779 [2024-07-20 16:16:41.581351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.037 #30 NEW cov: 11781 ft: 14079 corp: 16/353b lim: 35 exec/s: 0 rss: 67Mb L: 21/34 MS: 1 ChangeBit- 00:08:13.038 [2024-07-20 16:16:41.621569] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.038 [2024-07-20 16:16:41.621597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.038 [2024-07-20 16:16:41.621725] nvme_qpair.c: 215:nvme_admin_qpair_print_command: 
*NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.038 [2024-07-20 16:16:41.621744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.038 [2024-07-20 16:16:41.621873] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.038 [2024-07-20 16:16:41.621891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.038 #31 NEW cov: 11781 ft: 14169 corp: 17/384b lim: 35 exec/s: 0 rss: 67Mb L: 31/34 MS: 1 ChangeBit- 00:08:13.038 [2024-07-20 16:16:41.671402] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.038 [2024-07-20 16:16:41.671433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.038 [2024-07-20 16:16:41.671563] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.038 [2024-07-20 16:16:41.671584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.038 #32 NEW cov: 11781 ft: 14216 corp: 18/405b lim: 35 exec/s: 32 rss: 68Mb L: 21/34 MS: 1 ChangeByte- 00:08:13.038 [2024-07-20 16:16:41.711402] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.038 [2024-07-20 16:16:41.711430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.038 [2024-07-20 16:16:41.711568] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.038 [2024-07-20 16:16:41.711588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.038 [2024-07-20 16:16:41.711714] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.038 [2024-07-20 16:16:41.711731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.038 [2024-07-20 16:16:41.711846] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007eb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.038 [2024-07-20 16:16:41.711862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.038 #33 NEW cov: 11781 ft: 14426 corp: 19/437b lim: 35 exec/s: 33 rss: 68Mb L: 32/34 MS: 1 CrossOver- 00:08:13.038 [2024-07-20 16:16:41.751901] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.038 [2024-07-20 16:16:41.751929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.038 [2024-07-20 16:16:41.752075] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
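Mutation tags such as PersAutoDict- and CMP- followed by a DE: payload indicate that the interesting bytes came from libFuzzer's persistent auto-dictionary rather than a random flip, and each run's summary echoes those entries back as a "Recommended dictionary" with use counts. Those entries can be fed into a later run explicitly. A sketch in standard libFuzzer dictionary syntax, using the two entries from the run-14 summary above rewritten as hex escapes; whether the SPDK wrapper forwards -dict= through to libFuzzer is an assumption:

    # Entries copied from the run-14 "Recommended dictionary" block,
    # with the octal escapes (\000, \327, p) rewritten as hex.
    cat > nvmf_14.dict <<'EOF'
    kw1="\x00\x00\x00\x00\x00\x00\x00\x70"
    kw2="\x00\x00\x00\xd7"
    EOF
    # Then, assuming libFuzzer flags pass through the wrapper:
    #   ./llvm_nvme_fuzz ... -dict=nvmf_14.dict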
00:08:13.038 [2024-07-20 16:16:41.752093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.038 [2024-07-20 16:16:41.752230] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.038 [2024-07-20 16:16:41.752246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.038 #34 NEW cov: 11781 ft: 14460 corp: 20/468b lim: 35 exec/s: 34 rss: 68Mb L: 31/34 MS: 1 ChangeByte- 00:08:13.038 [2024-07-20 16:16:41.791911] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.038 [2024-07-20 16:16:41.791940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.038 [2024-07-20 16:16:41.792074] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.038 [2024-07-20 16:16:41.792093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.038 [2024-07-20 16:16:41.792240] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.038 [2024-07-20 16:16:41.792259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.038 #35 NEW cov: 11781 ft: 14517 corp: 21/496b lim: 35 exec/s: 35 rss: 68Mb L: 28/34 MS: 1 ShuffleBytes- 00:08:13.038 [2024-07-20 16:16:41.831870] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.038 [2024-07-20 16:16:41.831897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.038 [2024-07-20 16:16:41.832032] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.038 [2024-07-20 16:16:41.832051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.338 #36 NEW cov: 11781 ft: 14547 corp: 22/517b lim: 35 exec/s: 36 rss: 68Mb L: 21/34 MS: 1 ChangeBinInt- 00:08:13.338 [2024-07-20 16:16:41.872051] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.338 [2024-07-20 16:16:41.872080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.338 [2024-07-20 16:16:41.872219] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.338 [2024-07-20 16:16:41.872235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.338 #37 NEW cov: 11781 ft: 14549 corp: 23/541b lim: 35 exec/s: 37 rss: 68Mb L: 24/34 MS: 1 CopyPart- 00:08:13.338 [2024-07-20 16:16:41.912230] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000724 SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:08:13.338 [2024-07-20 16:16:41.912258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.338 [2024-07-20 16:16:41.912399] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.338 [2024-07-20 16:16:41.912415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.338 [2024-07-20 16:16:41.912549] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.339 [2024-07-20 16:16:41.912567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.339 [2024-07-20 16:16:41.912702] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007eb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.339 [2024-07-20 16:16:41.912720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.339 #38 NEW cov: 11781 ft: 14558 corp: 24/571b lim: 35 exec/s: 38 rss: 68Mb L: 30/34 MS: 1 InsertRepeatedBytes- 00:08:13.339 [2024-07-20 16:16:41.952404] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.339 [2024-07-20 16:16:41.952434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.339 [2024-07-20 16:16:41.952568] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.339 [2024-07-20 16:16:41.952587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.339 [2024-07-20 16:16:41.952715] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.339 [2024-07-20 16:16:41.952730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.339 #39 NEW cov: 11781 ft: 14605 corp: 25/599b lim: 35 exec/s: 39 rss: 68Mb L: 28/34 MS: 1 ChangeByte- 00:08:13.339 [2024-07-20 16:16:41.992806] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.339 [2024-07-20 16:16:41.992834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.339 [2024-07-20 16:16:41.992966] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007fd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.339 [2024-07-20 16:16:41.992986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.339 [2024-07-20 16:16:41.993123] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.339 [2024-07-20 16:16:41.993142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.339 
[2024-07-20 16:16:41.993289] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.339 [2024-07-20 16:16:41.993308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:13.339 #40 NEW cov: 11781 ft: 14652 corp: 26/634b lim: 35 exec/s: 40 rss: 68Mb L: 35/35 MS: 1 CrossOver- 00:08:13.339 [2024-07-20 16:16:42.042530] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.339 [2024-07-20 16:16:42.042558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.339 [2024-07-20 16:16:42.042692] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.339 [2024-07-20 16:16:42.042710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.339 #41 NEW cov: 11781 ft: 14706 corp: 27/656b lim: 35 exec/s: 41 rss: 68Mb L: 22/35 MS: 1 ChangeBit- 00:08:13.339 [2024-07-20 16:16:42.082782] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.339 [2024-07-20 16:16:42.082813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.339 [2024-07-20 16:16:42.082959] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.339 [2024-07-20 16:16:42.082978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.339 [2024-07-20 16:16:42.083114] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.339 [2024-07-20 16:16:42.083131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.635 #42 NEW cov: 11781 ft: 14789 corp: 28/687b lim: 35 exec/s: 42 rss: 69Mb L: 31/35 MS: 1 ChangeBit- 00:08:13.635 #48 NEW cov: 11781 ft: 15003 corp: 29/696b lim: 35 exec/s: 48 rss: 69Mb L: 9/35 MS: 1 CMP- DE: "\001.\362n\355\022c&"- 00:08:13.635 [2024-07-20 16:16:42.162980] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.635 [2024-07-20 16:16:42.163010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.635 [2024-07-20 16:16:42.163148] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.635 [2024-07-20 16:16:42.163167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.635 #49 NEW cov: 11781 ft: 15013 corp: 30/719b lim: 35 exec/s: 49 rss: 69Mb L: 23/35 MS: 1 EraseBytes- 00:08:13.635 [2024-07-20 16:16:42.203067] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.635 [2024-07-20 
16:16:42.203097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.635 [2024-07-20 16:16:42.203238] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.635 [2024-07-20 16:16:42.203258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.635 [2024-07-20 16:16:42.203395] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.635 [2024-07-20 16:16:42.203413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.635 #50 NEW cov: 11781 ft: 15042 corp: 31/747b lim: 35 exec/s: 50 rss: 69Mb L: 28/35 MS: 1 ShuffleBytes- 00:08:13.635 [2024-07-20 16:16:42.253358] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.635 [2024-07-20 16:16:42.253388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.635 [2024-07-20 16:16:42.253533] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.635 [2024-07-20 16:16:42.253553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.635 [2024-07-20 16:16:42.253692] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000365 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.635 [2024-07-20 16:16:42.253711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.635 #51 NEW cov: 11781 ft: 15065 corp: 32/780b lim: 35 exec/s: 51 rss: 69Mb L: 33/35 MS: 1 InsertRepeatedBytes- 00:08:13.635 [2024-07-20 16:16:42.293276] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.635 [2024-07-20 16:16:42.293306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.635 [2024-07-20 16:16:42.293449] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.635 [2024-07-20 16:16:42.293468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.635 #52 NEW cov: 11781 ft: 15087 corp: 33/803b lim: 35 exec/s: 52 rss: 69Mb L: 23/35 MS: 1 ChangeByte- 00:08:13.635 [2024-07-20 16:16:42.343447] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.635 [2024-07-20 16:16:42.343476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.635 [2024-07-20 16:16:42.343624] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.635 [2024-07-20 16:16:42.343641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.635 #55 NEW cov: 11781 ft: 15124 corp: 34/825b lim: 35 exec/s: 55 rss: 69Mb L: 22/35 MS: 3 EraseBytes-ChangeBit-InsertRepeatedBytes- 00:08:13.635 [2024-07-20 16:16:42.393280] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.636 [2024-07-20 16:16:42.393311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.636 #56 NEW cov: 11781 ft: 15128 corp: 35/841b lim: 35 exec/s: 56 rss: 69Mb L: 16/35 MS: 1 EraseBytes- 00:08:13.893 [2024-07-20 16:16:42.433802] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.893 [2024-07-20 16:16:42.433831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.893 [2024-07-20 16:16:42.433969] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007fd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.893 [2024-07-20 16:16:42.433986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.893 [2024-07-20 16:16:42.434130] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.893 [2024-07-20 16:16:42.434148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.893 #57 NEW cov: 11781 ft: 15137 corp: 36/875b lim: 35 exec/s: 57 rss: 69Mb L: 34/35 MS: 1 ChangeBinInt- 00:08:13.893 [2024-07-20 16:16:42.474029] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.893 [2024-07-20 16:16:42.474057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.893 [2024-07-20 16:16:42.474195] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.893 [2024-07-20 16:16:42.474214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.893 [2024-07-20 16:16:42.474348] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000073f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.893 [2024-07-20 16:16:42.474367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.893 [2024-07-20 16:16:42.474497] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.893 [2024-07-20 16:16:42.474515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.893 [2024-07-20 16:16:42.474652] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.894 [2024-07-20 16:16:42.474671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 
00:08:13.894 #58 NEW cov: 11781 ft: 15163 corp: 37/910b lim: 35 exec/s: 58 rss: 69Mb L: 35/35 MS: 1 CrossOver- 00:08:13.894 [2024-07-20 16:16:42.524342] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000724 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.894 [2024-07-20 16:16:42.524371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.894 [2024-07-20 16:16:42.524489] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.894 [2024-07-20 16:16:42.524506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.894 [2024-07-20 16:16:42.524633] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.894 [2024-07-20 16:16:42.524650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.894 [2024-07-20 16:16:42.524769] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007eb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.894 [2024-07-20 16:16:42.524785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.894 [2024-07-20 16:16:42.524914] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007eb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.894 [2024-07-20 16:16:42.524930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:13.894 #59 NEW cov: 11781 ft: 15168 corp: 38/945b lim: 35 exec/s: 59 rss: 69Mb L: 35/35 MS: 1 CopyPart- 00:08:13.894 [2024-07-20 16:16:42.574185] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.894 [2024-07-20 16:16:42.574216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.894 [2024-07-20 16:16:42.574345] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.894 [2024-07-20 16:16:42.574375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.894 [2024-07-20 16:16:42.574516] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000000b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.894 [2024-07-20 16:16:42.574536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.894 #60 NEW cov: 11781 ft: 15202 corp: 39/978b lim: 35 exec/s: 60 rss: 69Mb L: 33/35 MS: 1 InsertRepeatedBytes- 00:08:13.894 #61 NEW cov: 11781 ft: 15216 corp: 40/989b lim: 35 exec/s: 61 rss: 69Mb L: 11/35 MS: 1 CrossOver- 00:08:13.894 [2024-07-20 16:16:42.654275] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.894 [2024-07-20 16:16:42.654306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:08:13.894 [2024-07-20 16:16:42.654437] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.894 [2024-07-20 16:16:42.654459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.894 #62 NEW cov: 11781 ft: 15283 corp: 41/1010b lim: 35 exec/s: 31 rss: 69Mb L: 21/35 MS: 1 ChangeBinInt- 00:08:13.894 #62 DONE cov: 11781 ft: 15283 corp: 41/1010b lim: 35 exec/s: 31 rss: 69Mb 00:08:13.894 ###### Recommended dictionary. ###### 00:08:13.894 "\001.\362n\355\022c&" # Uses: 0 00:08:13.894 ###### End of recommended dictionary. ###### 00:08:13.894 Done 62 runs in 2 second(s) 00:08:14.153 16:16:42 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_15.conf 00:08:14.153 16:16:42 -- ../common.sh@72 -- # (( i++ )) 00:08:14.153 16:16:42 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:14.153 16:16:42 -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1 00:08:14.153 16:16:42 -- nvmf/run.sh@23 -- # local fuzzer_type=16 00:08:14.153 16:16:42 -- nvmf/run.sh@24 -- # local timen=1 00:08:14.153 16:16:42 -- nvmf/run.sh@25 -- # local core=0x1 00:08:14.153 16:16:42 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:14.153 16:16:42 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf 00:08:14.153 16:16:42 -- nvmf/run.sh@29 -- # printf %02d 16 00:08:14.153 16:16:42 -- nvmf/run.sh@29 -- # port=4416 00:08:14.153 16:16:42 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:14.153 16:16:42 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' 00:08:14.153 16:16:42 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:14.153 16:16:42 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 -r /var/tmp/spdk16.sock 00:08:14.153 [2024-07-20 16:16:42.831032] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
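(Editorial sketch, not part of the captured log: the nvmf/run.sh trace above contains everything needed to replay fuzzer 16 by hand — the listen port is "44" plus the zero-padded fuzzer number (printf %02d 16 -> 4416), the stock fuzz_json.conf has its trsvcid rewritten to that port, and llvm_nvme_fuzz is launched against the resulting TCP trid. A standalone equivalent, assuming only a local SPDK checkout in $SPDK; the Jenkins workspace paths above are not portable.)

# Derive the per-fuzzer port and config exactly as nvmf/run.sh does.
NUM=16
PORT=44$(printf %02d $NUM)            # -> 4416, same derivation as the trace
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$PORT\"/" \
    "$SPDK/test/fuzz/llvm/nvmf/fuzz_json.conf" > /tmp/fuzz_json_$NUM.conf
mkdir -p "$SPDK/../corpus/llvm_nvmf_$NUM"
# Launch the harness with the same flags seen in the trace: one core (-m 0x1),
# 512 MB hugepage memory (-s), crash artifacts to -P, 1-second run (-t),
# corpus dir (-D), fuzzer number (-Z), and a private RPC socket (-r).
"$SPDK/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
    -P "$SPDK/../output/llvm/" \
    -F "trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$PORT" \
    -c /tmp/fuzz_json_$NUM.conf -t 1 -D "$SPDK/../corpus/llvm_nvmf_$NUM" \
    -Z $NUM -r /var/tmp/spdk$NUM.sock

(The "Recommended dictionary" block printed at the end of run 15 is standard libFuzzer output; an entry such as "\001.\362n\355\022c&" is in AFL/libFuzzer dictionary syntax and could plausibly be fed back through a -dict= file, though nothing in this log shows the harness forwarding that option.)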
00:08:14.153 [2024-07-20 16:16:42.831126] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2276607 ] 00:08:14.153 EAL: No free 2048 kB hugepages reported on node 1 00:08:14.410 [2024-07-20 16:16:43.008148] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:14.410 [2024-07-20 16:16:43.028238] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:14.410 [2024-07-20 16:16:43.028365] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:14.410 [2024-07-20 16:16:43.079814] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:14.410 [2024-07-20 16:16:43.096129] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 *** 00:08:14.410 INFO: Running with entropic power schedule (0xFF, 100). 00:08:14.410 INFO: Seed: 1354085212 00:08:14.410 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:08:14.410 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:08:14.410 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:14.410 INFO: A corpus is not provided, starting from an empty corpus 00:08:14.410 #2 INITED exec/s: 0 rss: 59Mb 00:08:14.410 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:14.410 This may also happen if the target rejected all inputs we tried so far 00:08:14.410 [2024-07-20 16:16:43.141308] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.410 [2024-07-20 16:16:43.141341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.410 [2024-07-20 16:16:43.141398] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.410 [2024-07-20 16:16:43.141415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.667 NEW_FUNC[1/671]: 0x4a8270 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:08:14.667 NEW_FUNC[2/671]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:14.667 #4 NEW cov: 11594 ft: 11595 corp: 2/56b lim: 105 exec/s: 0 rss: 66Mb L: 55/55 MS: 2 ChangeBit-InsertRepeatedBytes- 00:08:14.667 [2024-07-20 16:16:43.452071] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65318 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.667 [2024-07-20 16:16:43.452104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.667 [2024-07-20 16:16:43.452161] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.667 [2024-07-20 16:16:43.452177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.925 #5 NEW cov: 11707 ft: 12097 corp: 3/111b lim: 105 exec/s: 0 rss: 66Mb L: 55/55 MS: 1 ChangeByte- 00:08:14.925 [2024-07-20 16:16:43.492387] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65318 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.925 [2024-07-20 16:16:43.492418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.925 [2024-07-20 16:16:43.492488] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.925 [2024-07-20 16:16:43.492506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.925 [2024-07-20 16:16:43.492561] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551397 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.925 [2024-07-20 16:16:43.492577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.925 [2024-07-20 16:16:43.492632] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.925 [2024-07-20 16:16:43.492652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.925 [2024-07-20 16:16:43.492705] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:1945555039024054271 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.925 [2024-07-20 16:16:43.492721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:14.925 #11 NEW cov: 11713 ft: 12851 corp: 4/216b lim: 105 exec/s: 0 rss: 66Mb L: 105/105 MS: 1 CrossOver- 00:08:14.925 [2024-07-20 16:16:43.532193] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.925 [2024-07-20 16:16:43.532221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.925 [2024-07-20 16:16:43.532266] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.925 [2024-07-20 16:16:43.532282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.925 #12 NEW cov: 11798 ft: 13091 corp: 5/271b lim: 105 exec/s: 0 rss: 66Mb L: 55/105 MS: 1 ChangeBit- 00:08:14.925 [2024-07-20 16:16:43.572320] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.925 [2024-07-20 16:16:43.572348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.925 [2024-07-20 16:16:43.572405] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:08:14.925 [2024-07-20 16:16:43.572421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.925 #13 NEW cov: 11798 ft: 13138 corp: 6/323b lim: 105 exec/s: 0 rss: 66Mb L: 52/105 MS: 1 EraseBytes- 00:08:14.925 [2024-07-20 16:16:43.602400] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551611 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.925 [2024-07-20 16:16:43.602427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.925 [2024-07-20 16:16:43.602475] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.925 [2024-07-20 16:16:43.602489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.925 #14 NEW cov: 11798 ft: 13220 corp: 7/378b lim: 105 exec/s: 0 rss: 66Mb L: 55/105 MS: 1 ChangeBit- 00:08:14.925 [2024-07-20 16:16:43.642513] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.925 [2024-07-20 16:16:43.642543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.925 [2024-07-20 16:16:43.642577] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.925 [2024-07-20 16:16:43.642593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.925 #15 NEW cov: 11798 ft: 13286 corp: 8/432b lim: 105 exec/s: 0 rss: 66Mb L: 54/105 MS: 1 EraseBytes- 00:08:14.925 [2024-07-20 16:16:43.672974] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65318 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.925 [2024-07-20 16:16:43.673004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.925 [2024-07-20 16:16:43.673042] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.925 [2024-07-20 16:16:43.673058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.925 [2024-07-20 16:16:43.673112] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551397 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.925 [2024-07-20 16:16:43.673128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.925 [2024-07-20 16:16:43.673182] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.925 [2024-07-20 16:16:43.673198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.925 
[2024-07-20 16:16:43.673256] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:1945555039024054271 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.925 [2024-07-20 16:16:43.673271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:14.925 #16 NEW cov: 11798 ft: 13328 corp: 9/537b lim: 105 exec/s: 0 rss: 67Mb L: 105/105 MS: 1 ChangeBit- 00:08:14.925 [2024-07-20 16:16:43.712743] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65408 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.925 [2024-07-20 16:16:43.712772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.925 [2024-07-20 16:16:43.712823] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.925 [2024-07-20 16:16:43.712840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.183 #17 NEW cov: 11798 ft: 13395 corp: 10/589b lim: 105 exec/s: 0 rss: 67Mb L: 52/105 MS: 1 ChangeBit- 00:08:15.183 [2024-07-20 16:16:43.752853] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.183 [2024-07-20 16:16:43.752882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.183 [2024-07-20 16:16:43.752931] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.183 [2024-07-20 16:16:43.752949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.183 #18 NEW cov: 11798 ft: 13460 corp: 11/651b lim: 105 exec/s: 0 rss: 67Mb L: 62/105 MS: 1 InsertRepeatedBytes- 00:08:15.183 [2024-07-20 16:16:43.792955] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744071864057855 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.183 [2024-07-20 16:16:43.792983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.183 [2024-07-20 16:16:43.793025] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.183 [2024-07-20 16:16:43.793040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.183 #22 NEW cov: 11798 ft: 13486 corp: 12/694b lim: 105 exec/s: 0 rss: 67Mb L: 43/105 MS: 4 CopyPart-ChangeByte-ChangeBinInt-CrossOver- 00:08:15.183 [2024-07-20 16:16:43.833409] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65318 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.183 [2024-07-20 16:16:43.833440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.183 [2024-07-20 16:16:43.833493] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551359 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.183 [2024-07-20 16:16:43.833510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.183 [2024-07-20 16:16:43.833566] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551397 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.183 [2024-07-20 16:16:43.833583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.183 [2024-07-20 16:16:43.833634] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.183 [2024-07-20 16:16:43.833649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.183 [2024-07-20 16:16:43.833704] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:1945555039024054271 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.183 [2024-07-20 16:16:43.833719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:15.183 #23 NEW cov: 11798 ft: 13514 corp: 13/799b lim: 105 exec/s: 0 rss: 67Mb L: 105/105 MS: 1 ChangeBinInt- 00:08:15.183 [2024-07-20 16:16:43.873243] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65408 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.183 [2024-07-20 16:16:43.873271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.183 [2024-07-20 16:16:43.873307] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.183 [2024-07-20 16:16:43.873323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.183 [2024-07-20 16:16:43.873377] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:30070 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.183 [2024-07-20 16:16:43.873393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.183 #24 NEW cov: 11798 ft: 13801 corp: 14/865b lim: 105 exec/s: 0 rss: 67Mb L: 66/105 MS: 1 InsertRepeatedBytes- 00:08:15.183 [2024-07-20 16:16:43.913615] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.183 [2024-07-20 16:16:43.913643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.183 [2024-07-20 16:16:43.913692] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.183 [2024-07-20 16:16:43.913709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 
dnr:1 00:08:15.183 [2024-07-20 16:16:43.913764] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551397 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.183 [2024-07-20 16:16:43.913782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.183 [2024-07-20 16:16:43.913836] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.183 [2024-07-20 16:16:43.913855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.183 [2024-07-20 16:16:43.913910] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:1945555039024054271 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.184 [2024-07-20 16:16:43.913926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:15.184 #25 NEW cov: 11798 ft: 13829 corp: 15/970b lim: 105 exec/s: 0 rss: 67Mb L: 105/105 MS: 1 CopyPart- 00:08:15.184 [2024-07-20 16:16:43.953390] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551611 len:15360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.184 [2024-07-20 16:16:43.953420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.184 [2024-07-20 16:16:43.953459] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.184 [2024-07-20 16:16:43.953475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.184 #26 NEW cov: 11798 ft: 13855 corp: 16/1026b lim: 105 exec/s: 0 rss: 67Mb L: 56/105 MS: 1 InsertByte- 00:08:15.441 [2024-07-20 16:16:43.993653] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65408 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.441 [2024-07-20 16:16:43.993682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.441 [2024-07-20 16:16:43.993723] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.441 [2024-07-20 16:16:43.993741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.441 [2024-07-20 16:16:43.993796] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.441 [2024-07-20 16:16:43.993813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.441 #27 NEW cov: 11798 ft: 13859 corp: 17/1107b lim: 105 exec/s: 0 rss: 67Mb L: 81/105 MS: 1 InsertRepeatedBytes- 00:08:15.441 [2024-07-20 16:16:44.033633] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744071864057855 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.441 
[2024-07-20 16:16:44.033663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.441 [2024-07-20 16:16:44.033704] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.441 [2024-07-20 16:16:44.033720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.441 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:15.441 #28 NEW cov: 11821 ft: 13905 corp: 18/1150b lim: 105 exec/s: 0 rss: 68Mb L: 43/105 MS: 1 ShuffleBytes- 00:08:15.441 [2024-07-20 16:16:44.074104] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65318 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.441 [2024-07-20 16:16:44.074132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.441 [2024-07-20 16:16:44.074181] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.441 [2024-07-20 16:16:44.074201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.441 [2024-07-20 16:16:44.074255] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551397 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.441 [2024-07-20 16:16:44.074272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.441 [2024-07-20 16:16:44.074327] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.441 [2024-07-20 16:16:44.074344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.441 [2024-07-20 16:16:44.074400] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:1945555039024054271 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.441 [2024-07-20 16:16:44.074414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:15.441 #29 NEW cov: 11821 ft: 14006 corp: 19/1255b lim: 105 exec/s: 0 rss: 68Mb L: 105/105 MS: 1 ShuffleBytes- 00:08:15.441 [2024-07-20 16:16:44.113867] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65318 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.441 [2024-07-20 16:16:44.113895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.441 [2024-07-20 16:16:44.113929] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.441 [2024-07-20 16:16:44.113945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.441 #30 NEW cov: 
11821 ft: 14033 corp: 20/1310b lim: 105 exec/s: 30 rss: 68Mb L: 55/105 MS: 1 CrossOver- 00:08:15.441 [2024-07-20 16:16:44.154335] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65318 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.441 [2024-07-20 16:16:44.154363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.441 [2024-07-20 16:16:44.154414] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.441 [2024-07-20 16:16:44.154430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.441 [2024-07-20 16:16:44.154488] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551397 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.441 [2024-07-20 16:16:44.154505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.441 [2024-07-20 16:16:44.154561] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.441 [2024-07-20 16:16:44.154576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.442 [2024-07-20 16:16:44.154633] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:1945555039020974079 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.442 [2024-07-20 16:16:44.154649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:15.442 #31 NEW cov: 11821 ft: 14052 corp: 21/1415b lim: 105 exec/s: 31 rss: 68Mb L: 105/105 MS: 1 ChangeByte- 00:08:15.442 [2024-07-20 16:16:44.194159] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744071864057855 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.442 [2024-07-20 16:16:44.194190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.442 [2024-07-20 16:16:44.194222] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.442 [2024-07-20 16:16:44.194237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.442 [2024-07-20 16:16:44.194292] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551487 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.442 [2024-07-20 16:16:44.194307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.442 #32 NEW cov: 11821 ft: 14063 corp: 22/1491b lim: 105 exec/s: 32 rss: 68Mb L: 76/105 MS: 1 InsertRepeatedBytes- 00:08:15.442 [2024-07-20 16:16:44.234168] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.442 [2024-07-20 
16:16:44.234196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.442 [2024-07-20 16:16:44.234232] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.442 [2024-07-20 16:16:44.234250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.699 #33 NEW cov: 11821 ft: 14133 corp: 23/1543b lim: 105 exec/s: 33 rss: 68Mb L: 52/105 MS: 1 EraseBytes- 00:08:15.699 [2024-07-20 16:16:44.274663] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65318 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.699 [2024-07-20 16:16:44.274690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.699 [2024-07-20 16:16:44.274743] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.699 [2024-07-20 16:16:44.274759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.699 [2024-07-20 16:16:44.274827] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551397 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.699 [2024-07-20 16:16:44.274844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.699 [2024-07-20 16:16:44.274899] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.699 [2024-07-20 16:16:44.274915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.699 [2024-07-20 16:16:44.274968] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:1945555039024054271 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.699 [2024-07-20 16:16:44.274985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:15.699 #34 NEW cov: 11821 ft: 14154 corp: 24/1648b lim: 105 exec/s: 34 rss: 68Mb L: 105/105 MS: 1 ChangeBit- 00:08:15.699 [2024-07-20 16:16:44.314599] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.699 [2024-07-20 16:16:44.314627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.699 [2024-07-20 16:16:44.314662] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.699 [2024-07-20 16:16:44.314682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.699 [2024-07-20 16:16:44.314736] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 
0x0 00:08:15.699 [2024-07-20 16:16:44.314752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.699 #35 NEW cov: 11821 ft: 14216 corp: 25/1717b lim: 105 exec/s: 35 rss: 68Mb L: 69/105 MS: 1 InsertRepeatedBytes- 00:08:15.699 [2024-07-20 16:16:44.354546] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65318 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.699 [2024-07-20 16:16:44.354575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.699 [2024-07-20 16:16:44.354608] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.699 [2024-07-20 16:16:44.354623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.699 #36 NEW cov: 11821 ft: 14304 corp: 26/1768b lim: 105 exec/s: 36 rss: 68Mb L: 51/105 MS: 1 CrossOver- 00:08:15.699 [2024-07-20 16:16:44.394796] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744071864057855 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.699 [2024-07-20 16:16:44.394824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.699 [2024-07-20 16:16:44.394857] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.699 [2024-07-20 16:16:44.394872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.699 [2024-07-20 16:16:44.394926] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551487 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.699 [2024-07-20 16:16:44.394942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.699 #37 NEW cov: 11821 ft: 14342 corp: 27/1844b lim: 105 exec/s: 37 rss: 68Mb L: 76/105 MS: 1 ShuffleBytes- 00:08:15.699 [2024-07-20 16:16:44.434984] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65318 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.699 [2024-07-20 16:16:44.435010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.699 [2024-07-20 16:16:44.435077] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18374686483966590975 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.699 [2024-07-20 16:16:44.435093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.699 [2024-07-20 16:16:44.435148] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.699 [2024-07-20 16:16:44.435164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.699 [2024-07-20 16:16:44.435219] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.699 [2024-07-20 16:16:44.435235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.699 #38 NEW cov: 11821 ft: 14355 corp: 28/1947b lim: 105 exec/s: 38 rss: 68Mb L: 103/105 MS: 1 InsertRepeatedBytes- 00:08:15.700 [2024-07-20 16:16:44.475266] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65318 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.700 [2024-07-20 16:16:44.475293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.700 [2024-07-20 16:16:44.475346] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.700 [2024-07-20 16:16:44.475361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.700 [2024-07-20 16:16:44.475415] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551397 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.700 [2024-07-20 16:16:44.475431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.700 [2024-07-20 16:16:44.475490] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.700 [2024-07-20 16:16:44.475505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.700 [2024-07-20 16:16:44.475560] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:1945555039024054271 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.700 [2024-07-20 16:16:44.475574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:15.700 #39 NEW cov: 11821 ft: 14405 corp: 29/2052b lim: 105 exec/s: 39 rss: 68Mb L: 105/105 MS: 1 ChangeBit- 00:08:15.957 [2024-07-20 16:16:44.515398] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65318 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.957 [2024-07-20 16:16:44.515425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.957 [2024-07-20 16:16:44.515484] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.957 [2024-07-20 16:16:44.515500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.957 [2024-07-20 16:16:44.515556] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551397 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.957 [2024-07-20 16:16:44.515572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
00:08:15.957 [2024-07-20 16:16:44.515625] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.957 [2024-07-20 16:16:44.515640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.957 [2024-07-20 16:16:44.515694] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:1945555039020974079 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.957 [2024-07-20 16:16:44.515712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:15.957 #40 NEW cov: 11821 ft: 14445 corp: 30/2157b lim: 105 exec/s: 40 rss: 68Mb L: 105/105 MS: 1 ChangeBit- 00:08:15.957 [2024-07-20 16:16:44.555266] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65318 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.957 [2024-07-20 16:16:44.555293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.957 [2024-07-20 16:16:44.555326] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.957 [2024-07-20 16:16:44.555340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.957 [2024-07-20 16:16:44.555396] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551397 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.957 [2024-07-20 16:16:44.555412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.957 #41 NEW cov: 11821 ft: 14459 corp: 31/2236b lim: 105 exec/s: 41 rss: 69Mb L: 79/105 MS: 1 EraseBytes- 00:08:15.957 [2024-07-20 16:16:44.595648] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.957 [2024-07-20 16:16:44.595676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.957 [2024-07-20 16:16:44.595724] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.958 [2024-07-20 16:16:44.595741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.958 [2024-07-20 16:16:44.595795] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744070052118527 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.958 [2024-07-20 16:16:44.595812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.958 [2024-07-20 16:16:44.595867] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.958 [2024-07-20 16:16:44.595884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.958 [2024-07-20 16:16:44.595941] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:1945555039024054271 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.958 [2024-07-20 16:16:44.595957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:15.958 #42 NEW cov: 11821 ft: 14472 corp: 32/2341b lim: 105 exec/s: 42 rss: 69Mb L: 105/105 MS: 1 CrossOver- 00:08:15.958 [2024-07-20 16:16:44.635530] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.958 [2024-07-20 16:16:44.635557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.958 [2024-07-20 16:16:44.635593] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551370 len:65318 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.958 [2024-07-20 16:16:44.635608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.958 [2024-07-20 16:16:44.635664] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.958 [2024-07-20 16:16:44.635680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.958 #43 NEW cov: 11821 ft: 14489 corp: 33/2417b lim: 105 exec/s: 43 rss: 69Mb L: 76/105 MS: 1 CopyPart- 00:08:15.958 [2024-07-20 16:16:44.675631] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65408 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.958 [2024-07-20 16:16:44.675658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.958 [2024-07-20 16:16:44.675692] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.958 [2024-07-20 16:16:44.675708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.958 [2024-07-20 16:16:44.675764] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.958 [2024-07-20 16:16:44.675781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.958 #44 NEW cov: 11821 ft: 14498 corp: 34/2498b lim: 105 exec/s: 44 rss: 69Mb L: 81/105 MS: 1 ShuffleBytes- 00:08:15.958 [2024-07-20 16:16:44.715740] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744071864057855 len:65281 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.958 [2024-07-20 16:16:44.715768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.958 [2024-07-20 16:16:44.715804] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:08:15.958 [2024-07-20 16:16:44.715821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.958 [2024-07-20 16:16:44.715876] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551487 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.958 [2024-07-20 16:16:44.715892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.958 #45 NEW cov: 11821 ft: 14514 corp: 35/2574b lim: 105 exec/s: 45 rss: 69Mb L: 76/105 MS: 1 CMP- DE: "\000\000\000\000"- 00:08:15.958 [2024-07-20 16:16:44.755582] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551611 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.958 [2024-07-20 16:16:44.755609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.216 #46 NEW cov: 11821 ft: 14947 corp: 36/2608b lim: 105 exec/s: 46 rss: 69Mb L: 34/105 MS: 1 EraseBytes- 00:08:16.216 [2024-07-20 16:16:44.796180] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65318 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.216 [2024-07-20 16:16:44.796209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.216 [2024-07-20 16:16:44.796258] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.216 [2024-07-20 16:16:44.796275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.216 [2024-07-20 16:16:44.796327] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551397 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.216 [2024-07-20 16:16:44.796343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.216 [2024-07-20 16:16:44.796396] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.216 [2024-07-20 16:16:44.796410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.216 [2024-07-20 16:16:44.796466] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:1945555039024054271 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.216 [2024-07-20 16:16:44.796481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:16.216 #47 NEW cov: 11821 ft: 14953 corp: 37/2713b lim: 105 exec/s: 47 rss: 69Mb L: 105/105 MS: 1 CopyPart- 00:08:16.216 [2024-07-20 16:16:44.835946] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65318 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.216 [2024-07-20 16:16:44.835974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 
dnr:1 00:08:16.216 [2024-07-20 16:16:44.836020] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.216 [2024-07-20 16:16:44.836036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.216 #48 NEW cov: 11821 ft: 14972 corp: 38/2772b lim: 105 exec/s: 48 rss: 69Mb L: 59/105 MS: 1 InsertRepeatedBytes- 00:08:16.216 [2024-07-20 16:16:44.866011] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65318 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.216 [2024-07-20 16:16:44.866039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.216 [2024-07-20 16:16:44.866074] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.216 [2024-07-20 16:16:44.866090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.216 #49 NEW cov: 11821 ft: 14977 corp: 39/2831b lim: 105 exec/s: 49 rss: 69Mb L: 59/105 MS: 1 ChangeBinInt- 00:08:16.216 [2024-07-20 16:16:44.906266] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65318 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.216 [2024-07-20 16:16:44.906294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.216 [2024-07-20 16:16:44.906335] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.216 [2024-07-20 16:16:44.906352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.216 [2024-07-20 16:16:44.906409] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551397 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.216 [2024-07-20 16:16:44.906425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.216 #50 NEW cov: 11821 ft: 14988 corp: 40/2906b lim: 105 exec/s: 50 rss: 69Mb L: 75/105 MS: 1 CrossOver- 00:08:16.216 [2024-07-20 16:16:44.946605] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65318 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.216 [2024-07-20 16:16:44.946632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.216 [2024-07-20 16:16:44.946683] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.216 [2024-07-20 16:16:44.946700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.216 [2024-07-20 16:16:44.946756] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551397 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 
0x0 00:08:16.216 [2024-07-20 16:16:44.946772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.216 [2024-07-20 16:16:44.946827] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.216 [2024-07-20 16:16:44.946845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.216 [2024-07-20 16:16:44.946877] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:1945555039020974079 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.216 [2024-07-20 16:16:44.946891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:16.216 #51 NEW cov: 11821 ft: 14991 corp: 41/3011b lim: 105 exec/s: 51 rss: 69Mb L: 105/105 MS: 1 ChangeBinInt- 00:08:16.216 [2024-07-20 16:16:44.986345] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744071864057855 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.216 [2024-07-20 16:16:44.986373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.216 [2024-07-20 16:16:44.986420] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.216 [2024-07-20 16:16:44.986436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.216 #52 NEW cov: 11821 ft: 15002 corp: 42/3054b lim: 105 exec/s: 52 rss: 69Mb L: 43/105 MS: 1 ChangeBit- 00:08:16.216 [2024-07-20 16:16:45.016833] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.216 [2024-07-20 16:16:45.016860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.216 [2024-07-20 16:16:45.016910] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551359 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.216 [2024-07-20 16:16:45.016928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.216 [2024-07-20 16:16:45.016983] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551397 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.216 [2024-07-20 16:16:45.017000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.216 [2024-07-20 16:16:45.017057] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.216 [2024-07-20 16:16:45.017072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.216 [2024-07-20 16:16:45.017128] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 
lba:1945555039024054271 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.216 [2024-07-20 16:16:45.017145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:16.474 #53 NEW cov: 11821 ft: 15005 corp: 43/3159b lim: 105 exec/s: 53 rss: 69Mb L: 105/105 MS: 1 ShuffleBytes- 00:08:16.474 [2024-07-20 16:16:45.056627] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.474 [2024-07-20 16:16:45.056655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.474 [2024-07-20 16:16:45.056691] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:16140901064495857663 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.474 [2024-07-20 16:16:45.056707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.474 [2024-07-20 16:16:45.056765] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.474 [2024-07-20 16:16:45.056779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.474 #54 NEW cov: 11821 ft: 15024 corp: 44/3228b lim: 105 exec/s: 54 rss: 69Mb L: 69/105 MS: 1 ChangeBit- 00:08:16.474 [2024-07-20 16:16:45.096662] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65318 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.474 [2024-07-20 16:16:45.096689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.474 [2024-07-20 16:16:45.096739] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744071041974271 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.474 [2024-07-20 16:16:45.096756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.474 #55 NEW cov: 11821 ft: 15041 corp: 45/3287b lim: 105 exec/s: 55 rss: 69Mb L: 59/105 MS: 1 ChangeByte- 00:08:16.474 [2024-07-20 16:16:45.136976] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.474 [2024-07-20 16:16:45.137004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.474 [2024-07-20 16:16:45.137054] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.474 [2024-07-20 16:16:45.137070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.475 [2024-07-20 16:16:45.137122] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.475 [2024-07-20 16:16:45.137137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 
sqhd:0004 p:0 m:0 dnr:1 00:08:16.475 [2024-07-20 16:16:45.137191] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551397 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.475 [2024-07-20 16:16:45.137206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.475 #56 NEW cov: 11821 ft: 15048 corp: 46/3382b lim: 105 exec/s: 28 rss: 69Mb L: 95/105 MS: 1 CrossOver- 00:08:16.475 #56 DONE cov: 11821 ft: 15048 corp: 46/3382b lim: 105 exec/s: 28 rss: 69Mb 00:08:16.475 ###### Recommended dictionary. ###### 00:08:16.475 "\000\000\000\000" # Uses: 0 00:08:16.475 ###### End of recommended dictionary. ###### 00:08:16.475 Done 56 runs in 2 second(s) 00:08:16.475 16:16:45 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_16.conf 00:08:16.475 16:16:45 -- ../common.sh@72 -- # (( i++ )) 00:08:16.475 16:16:45 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:16.475 16:16:45 -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1 00:08:16.475 16:16:45 -- nvmf/run.sh@23 -- # local fuzzer_type=17 00:08:16.475 16:16:45 -- nvmf/run.sh@24 -- # local timen=1 00:08:16.475 16:16:45 -- nvmf/run.sh@25 -- # local core=0x1 00:08:16.475 16:16:45 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:16.475 16:16:45 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf 00:08:16.475 16:16:45 -- nvmf/run.sh@29 -- # printf %02d 17 00:08:16.475 16:16:45 -- nvmf/run.sh@29 -- # port=4417 00:08:16.475 16:16:45 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:16.732 16:16:45 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' 00:08:16.732 16:16:45 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:16.732 16:16:45 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 -r /var/tmp/spdk17.sock 00:08:16.732 [2024-07-20 16:16:45.311023] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
00:08:16.732 [2024-07-20 16:16:45.311102] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2276929 ] 00:08:16.732 EAL: No free 2048 kB hugepages reported on node 1 00:08:16.732 [2024-07-20 16:16:45.484023] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:16.732 [2024-07-20 16:16:45.503178] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:16.732 [2024-07-20 16:16:45.503318] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:16.990 [2024-07-20 16:16:45.554732] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:16.990 [2024-07-20 16:16:45.571054] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 *** 00:08:16.990 INFO: Running with entropic power schedule (0xFF, 100). 00:08:16.990 INFO: Seed: 3830085069 00:08:16.990 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:08:16.990 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:08:16.990 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:16.990 INFO: A corpus is not provided, starting from an empty corpus 00:08:16.990 #2 INITED exec/s: 0 rss: 59Mb 00:08:16.990 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:16.990 This may also happen if the target rejected all inputs we tried so far 00:08:16.990 [2024-07-20 16:16:45.637083] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15191436292489663186 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.990 [2024-07-20 16:16:45.637123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.990 [2024-07-20 16:16:45.637236] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15191436295996101330 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.990 [2024-07-20 16:16:45.637256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.247 NEW_FUNC[1/672]: 0x4ab560 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:08:17.247 NEW_FUNC[2/672]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:17.247 #6 NEW cov: 11615 ft: 11615 corp: 2/57b lim: 120 exec/s: 0 rss: 66Mb L: 56/56 MS: 4 CopyPart-EraseBytes-ChangeBinInt-InsertRepeatedBytes- 00:08:17.247 [2024-07-20 16:16:45.957625] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15191436292489663186 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.247 [2024-07-20 16:16:45.957668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.247 #7 NEW cov: 11728 ft: 13051 corp: 3/90b lim: 120 exec/s: 0 rss: 66Mb L: 33/56 MS: 1 EraseBytes- 00:08:17.247 [2024-07-20 16:16:46.008011] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15191436292489663186 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:17.247 [2024-07-20 16:16:46.008043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.247 [2024-07-20 16:16:46.008139] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15191436295996101330 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.247 [2024-07-20 16:16:46.008166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.247 #8 NEW cov: 11734 ft: 13229 corp: 4/147b lim: 120 exec/s: 0 rss: 66Mb L: 57/57 MS: 1 InsertByte- 00:08:17.247 [2024-07-20 16:16:46.047425] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15191436292489663186 len:47827 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.247 [2024-07-20 16:16:46.047456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.505 #14 NEW cov: 11819 ft: 13615 corp: 5/180b lim: 120 exec/s: 0 rss: 66Mb L: 33/57 MS: 1 ChangeByte- 00:08:17.505 [2024-07-20 16:16:46.097822] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15191436292489663186 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.505 [2024-07-20 16:16:46.097854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.505 [2024-07-20 16:16:46.097989] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15191436295996101330 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.505 [2024-07-20 16:16:46.098010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.505 #15 NEW cov: 11819 ft: 13731 corp: 6/237b lim: 120 exec/s: 0 rss: 66Mb L: 57/57 MS: 1 CrossOver- 00:08:17.505 [2024-07-20 16:16:46.138297] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15191436292489663186 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.505 [2024-07-20 16:16:46.138331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.505 [2024-07-20 16:16:46.138450] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15191436295996101330 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.505 [2024-07-20 16:16:46.138473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.505 #16 NEW cov: 11819 ft: 13901 corp: 7/294b lim: 120 exec/s: 0 rss: 66Mb L: 57/57 MS: 1 ChangeBit- 00:08:17.505 [2024-07-20 16:16:46.178431] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15191436292489663186 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.505 [2024-07-20 16:16:46.178467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.505 [2024-07-20 16:16:46.178571] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15191436295996101330 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.505 [2024-07-20 16:16:46.178594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.505 #17 NEW cov: 11819 ft: 14001 corp: 8/359b lim: 120 exec/s: 0 rss: 67Mb L: 65/65 MS: 1 InsertRepeatedBytes- 00:08:17.505 [2024-07-20 16:16:46.218502] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15191436292489663186 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.505 [2024-07-20 16:16:46.218532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.505 [2024-07-20 16:16:46.218637] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15191436295996101330 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.505 [2024-07-20 16:16:46.218667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.505 #18 NEW cov: 11819 ft: 14042 corp: 9/424b lim: 120 exec/s: 0 rss: 67Mb L: 65/65 MS: 1 ChangeBinInt- 00:08:17.505 [2024-07-20 16:16:46.258203] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15191436292489663186 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.505 [2024-07-20 16:16:46.258235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.505 [2024-07-20 16:16:46.258309] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15191436295996101330 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.505 [2024-07-20 16:16:46.258333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.505 #19 NEW cov: 11819 ft: 14165 corp: 10/481b lim: 120 exec/s: 0 rss: 67Mb L: 57/65 MS: 1 ChangeByte- 00:08:17.505 [2024-07-20 16:16:46.298513] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.505 [2024-07-20 16:16:46.298539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.763 #24 NEW cov: 11819 ft: 14180 corp: 11/525b lim: 120 exec/s: 0 rss: 67Mb L: 44/65 MS: 5 InsertByte-ChangeByte-ChangeBit-InsertByte-InsertRepeatedBytes- 00:08:17.763 [2024-07-20 16:16:46.338913] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15191436292489663186 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.763 [2024-07-20 16:16:46.338949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.763 [2024-07-20 16:16:46.339060] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15191436295996101330 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.763 [2024-07-20 16:16:46.339087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.763 #25 NEW cov: 11819 ft: 14234 corp: 12/582b lim: 120 exec/s: 0 rss: 67Mb L: 57/65 MS: 1 CrossOver- 00:08:17.763 [2024-07-20 16:16:46.379045] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15191436292489663186 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.763 [2024-07-20 16:16:46.379076] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.763 [2024-07-20 16:16:46.379151] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15191436295996101330 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.763 [2024-07-20 16:16:46.379179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.763 #26 NEW cov: 11819 ft: 14255 corp: 13/647b lim: 120 exec/s: 0 rss: 67Mb L: 65/65 MS: 1 CrossOver- 00:08:17.763 [2024-07-20 16:16:46.419180] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15191436292489663186 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.763 [2024-07-20 16:16:46.419211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.763 [2024-07-20 16:16:46.419327] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15191435394052969170 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.763 [2024-07-20 16:16:46.419348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.763 #27 NEW cov: 11819 ft: 14331 corp: 14/704b lim: 120 exec/s: 0 rss: 67Mb L: 57/65 MS: 1 ChangeBinInt- 00:08:17.763 [2024-07-20 16:16:46.458958] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.763 [2024-07-20 16:16:46.458985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.763 #28 NEW cov: 11819 ft: 14346 corp: 15/746b lim: 120 exec/s: 0 rss: 68Mb L: 42/65 MS: 1 EraseBytes- 00:08:17.763 [2024-07-20 16:16:46.499386] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15191436292489663186 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.763 [2024-07-20 16:16:46.499412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.763 [2024-07-20 16:16:46.499546] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15191436295996101330 len:211 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.763 [2024-07-20 16:16:46.499571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.763 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:17.763 #29 NEW cov: 11842 ft: 14379 corp: 16/812b lim: 120 exec/s: 0 rss: 68Mb L: 66/66 MS: 1 InsertByte- 00:08:17.763 [2024-07-20 16:16:46.539436] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15191436292489663186 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.763 [2024-07-20 16:16:46.539472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.763 [2024-07-20 16:16:46.539564] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15191436295996101330 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.763 [2024-07-20 16:16:46.539592] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.763 #30 NEW cov: 11842 ft: 14393 corp: 17/869b lim: 120 exec/s: 0 rss: 68Mb L: 57/66 MS: 1 ChangeBit- 00:08:18.021 [2024-07-20 16:16:46.579515] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15191436292489663186 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.021 [2024-07-20 16:16:46.579548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.021 [2024-07-20 16:16:46.579624] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15191436296046432978 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.021 [2024-07-20 16:16:46.579648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.021 #31 NEW cov: 11842 ft: 14440 corp: 18/926b lim: 120 exec/s: 0 rss: 68Mb L: 57/66 MS: 1 ChangeBinInt- 00:08:18.021 [2024-07-20 16:16:46.619495] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.021 [2024-07-20 16:16:46.619521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.021 #32 NEW cov: 11842 ft: 14462 corp: 19/968b lim: 120 exec/s: 32 rss: 68Mb L: 42/66 MS: 1 ChangeASCIIInt- 00:08:18.021 [2024-07-20 16:16:46.659857] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15191436292489663186 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.021 [2024-07-20 16:16:46.659891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.021 [2024-07-20 16:16:46.660010] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15191436295996101330 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.021 [2024-07-20 16:16:46.660036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.021 #33 NEW cov: 11842 ft: 14479 corp: 20/1025b lim: 120 exec/s: 33 rss: 68Mb L: 57/66 MS: 1 ChangeByte- 00:08:18.021 [2024-07-20 16:16:46.709756] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15191436292489665746 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.021 [2024-07-20 16:16:46.709792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.021 [2024-07-20 16:16:46.709911] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15191436295996101330 len:211 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.021 [2024-07-20 16:16:46.709942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.021 #34 NEW cov: 11842 ft: 14511 corp: 21/1091b lim: 120 exec/s: 34 rss: 68Mb L: 66/66 MS: 1 ChangeBinInt- 00:08:18.021 [2024-07-20 16:16:46.749765] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14468045605681273544 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.021 [2024-07-20 
16:16:46.749792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.021 #37 NEW cov: 11842 ft: 14514 corp: 22/1124b lim: 120 exec/s: 37 rss: 68Mb L: 33/66 MS: 3 InsertRepeatedBytes-InsertByte-CrossOver- 00:08:18.021 [2024-07-20 16:16:46.800106] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15191436292489663186 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.021 [2024-07-20 16:16:46.800136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.279 #38 NEW cov: 11842 ft: 14565 corp: 23/1159b lim: 120 exec/s: 38 rss: 68Mb L: 35/66 MS: 1 EraseBytes- 00:08:18.279 [2024-07-20 16:16:46.860222] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.279 [2024-07-20 16:16:46.860254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.279 #39 NEW cov: 11842 ft: 14648 corp: 24/1201b lim: 120 exec/s: 39 rss: 69Mb L: 42/66 MS: 1 ChangeBit- 00:08:18.279 [2024-07-20 16:16:46.909946] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15191436292489663186 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.279 [2024-07-20 16:16:46.909972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.280 #40 NEW cov: 11842 ft: 14683 corp: 25/1238b lim: 120 exec/s: 40 rss: 69Mb L: 37/66 MS: 1 EraseBytes- 00:08:18.280 [2024-07-20 16:16:46.950511] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15191436292489663186 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.280 [2024-07-20 16:16:46.950543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.280 [2024-07-20 16:16:46.950650] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15191436295996101330 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.280 [2024-07-20 16:16:46.950675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.280 #41 NEW cov: 11842 ft: 14694 corp: 26/1295b lim: 120 exec/s: 41 rss: 69Mb L: 57/66 MS: 1 ChangeBit- 00:08:18.280 [2024-07-20 16:16:46.990658] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15191436292489663186 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.280 [2024-07-20 16:16:46.990693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.280 [2024-07-20 16:16:46.990808] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15191436295996101330 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.280 [2024-07-20 16:16:46.990830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.280 #42 NEW cov: 11842 ft: 14711 corp: 27/1363b lim: 120 exec/s: 42 rss: 69Mb L: 68/68 MS: 1 CopyPart- 00:08:18.280 [2024-07-20 16:16:47.030710] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15191436292489663186 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.280 [2024-07-20 16:16:47.030744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.280 #43 NEW cov: 11842 ft: 14730 corp: 28/1398b lim: 120 exec/s: 43 rss: 69Mb L: 35/68 MS: 1 ChangeByte- 00:08:18.280 [2024-07-20 16:16:47.080909] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15191436292489663186 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.280 [2024-07-20 16:16:47.080942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.280 [2024-07-20 16:16:47.081061] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15191436295996101330 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.280 [2024-07-20 16:16:47.081088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.552 #44 NEW cov: 11842 ft: 14734 corp: 29/1455b lim: 120 exec/s: 44 rss: 69Mb L: 57/68 MS: 1 ShuffleBytes- 00:08:18.552 [2024-07-20 16:16:47.131338] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15191436292489663186 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.552 [2024-07-20 16:16:47.131377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.552 [2024-07-20 16:16:47.131496] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15191436295996101330 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.552 [2024-07-20 16:16:47.131522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.552 #45 NEW cov: 11842 ft: 14803 corp: 30/1512b lim: 120 exec/s: 45 rss: 69Mb L: 57/68 MS: 1 CopyPart- 00:08:18.552 [2024-07-20 16:16:47.180990] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15191436292489663186 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.552 [2024-07-20 16:16:47.181025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.552 [2024-07-20 16:16:47.181144] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15191436295996101330 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.552 [2024-07-20 16:16:47.181172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.552 #46 NEW cov: 11842 ft: 14815 corp: 31/1570b lim: 120 exec/s: 46 rss: 69Mb L: 58/68 MS: 1 InsertByte- 00:08:18.552 [2024-07-20 16:16:47.221308] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15191436292489663186 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.552 [2024-07-20 16:16:47.221344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.552 [2024-07-20 16:16:47.221454] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15191436295996101330 len:53971 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.552 [2024-07-20 16:16:47.221467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.552 #47 NEW cov: 11842 ft: 14816 corp: 32/1623b lim: 120 exec/s: 47 rss: 69Mb L: 53/68 MS: 1 EraseBytes- 00:08:18.552 [2024-07-20 16:16:47.261601] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15191436292489663186 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.552 [2024-07-20 16:16:47.261632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.552 [2024-07-20 16:16:47.261731] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15191436295996101330 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.552 [2024-07-20 16:16:47.261755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.552 #48 NEW cov: 11842 ft: 14819 corp: 33/1680b lim: 120 exec/s: 48 rss: 69Mb L: 57/68 MS: 1 ChangeByte- 00:08:18.552 [2024-07-20 16:16:47.301966] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.552 [2024-07-20 16:16:47.302001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.552 [2024-07-20 16:16:47.302135] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.552 [2024-07-20 16:16:47.302162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.552 [2024-07-20 16:16:47.302281] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.552 [2024-07-20 16:16:47.302301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.552 #52 NEW cov: 11842 ft: 15206 corp: 34/1756b lim: 120 exec/s: 52 rss: 69Mb L: 76/76 MS: 4 InsertByte-CopyPart-CrossOver-InsertRepeatedBytes- 00:08:18.552 [2024-07-20 16:16:47.341649] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8608480567731124087 len:30584 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.553 [2024-07-20 16:16:47.341679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.811 #53 NEW cov: 11842 ft: 15210 corp: 35/1792b lim: 120 exec/s: 53 rss: 69Mb L: 36/76 MS: 1 EraseBytes- 00:08:18.811 [2024-07-20 16:16:47.381979] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15191436292489665746 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.811 [2024-07-20 16:16:47.382014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.811 [2024-07-20 16:16:47.382139] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15191436295996101330 len:211 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.811 [2024-07-20 
16:16:47.382163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.811 #54 NEW cov: 11842 ft: 15228 corp: 36/1858b lim: 120 exec/s: 54 rss: 69Mb L: 66/76 MS: 1 ChangeBinInt- 00:08:18.811 [2024-07-20 16:16:47.422126] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15191436292489663186 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.811 [2024-07-20 16:16:47.422157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.811 [2024-07-20 16:16:47.422281] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15191435394052969170 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.811 [2024-07-20 16:16:47.422298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.811 #55 NEW cov: 11842 ft: 15230 corp: 37/1915b lim: 120 exec/s: 55 rss: 69Mb L: 57/76 MS: 1 ChangeBinInt- 00:08:18.811 [2024-07-20 16:16:47.462514] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:6655295899584090716 len:23645 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.811 [2024-07-20 16:16:47.462546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.811 [2024-07-20 16:16:47.462587] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15191436294016389842 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.811 [2024-07-20 16:16:47.462614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.811 [2024-07-20 16:16:47.462728] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:15191436295996101330 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.811 [2024-07-20 16:16:47.462751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.811 #56 NEW cov: 11842 ft: 15253 corp: 38/1994b lim: 120 exec/s: 56 rss: 70Mb L: 79/79 MS: 1 InsertRepeatedBytes- 00:08:18.811 [2024-07-20 16:16:47.502389] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15191436292489663186 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.811 [2024-07-20 16:16:47.502424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.811 [2024-07-20 16:16:47.502544] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15191436296046432978 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.811 [2024-07-20 16:16:47.502569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.811 #57 NEW cov: 11842 ft: 15262 corp: 39/2051b lim: 120 exec/s: 57 rss: 70Mb L: 57/79 MS: 1 ChangeBinInt- 00:08:18.811 [2024-07-20 16:16:47.542383] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15191436292489665746 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.811 [2024-07-20 16:16:47.542417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.811 [2024-07-20 16:16:47.542530] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15191436295996101330 len:53761 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.811 [2024-07-20 16:16:47.542551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.811 #58 NEW cov: 11842 ft: 15269 corp: 40/2118b lim: 120 exec/s: 58 rss: 70Mb L: 67/79 MS: 1 InsertByte- 00:08:18.811 [2024-07-20 16:16:47.582860] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.811 [2024-07-20 16:16:47.582891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.811 [2024-07-20 16:16:47.582974] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.811 [2024-07-20 16:16:47.582999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.811 [2024-07-20 16:16:47.583131] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.811 [2024-07-20 16:16:47.583155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.811 #59 NEW cov: 11842 ft: 15281 corp: 41/2211b lim: 120 exec/s: 59 rss: 70Mb L: 93/93 MS: 1 CopyPart- 00:08:19.068 [2024-07-20 16:16:47.622543] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15191436292489663186 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.068 [2024-07-20 16:16:47.622575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.068 [2024-07-20 16:16:47.622686] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16128185018489164498 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.068 [2024-07-20 16:16:47.622710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.068 #60 NEW cov: 11842 ft: 15285 corp: 42/2269b lim: 120 exec/s: 30 rss: 70Mb L: 58/93 MS: 1 InsertByte- 00:08:19.068 #60 DONE cov: 11842 ft: 15285 corp: 42/2269b lim: 120 exec/s: 30 rss: 70Mb 00:08:19.068 Done 60 runs in 2 second(s) 00:08:19.068 16:16:47 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_17.conf 00:08:19.068 16:16:47 -- ../common.sh@72 -- # (( i++ )) 00:08:19.068 16:16:47 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:19.068 16:16:47 -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1 00:08:19.068 16:16:47 -- nvmf/run.sh@23 -- # local fuzzer_type=18 00:08:19.068 16:16:47 -- nvmf/run.sh@24 -- # local timen=1 00:08:19.068 16:16:47 -- nvmf/run.sh@25 -- # local core=0x1 00:08:19.068 16:16:47 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:19.068 16:16:47 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf 00:08:19.068 16:16:47 -- nvmf/run.sh@29 -- # printf %02d 18 00:08:19.068 16:16:47 -- nvmf/run.sh@29 -- # 
port=4418 00:08:19.068 16:16:47 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:19.068 16:16:47 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' 00:08:19.068 16:16:47 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:19.068 16:16:47 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 -r /var/tmp/spdk18.sock 00:08:19.068 [2024-07-20 16:16:47.787068] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:08:19.069 [2024-07-20 16:16:47.787138] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2277434 ] 00:08:19.069 EAL: No free 2048 kB hugepages reported on node 1 00:08:19.326 [2024-07-20 16:16:47.960737] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:19.326 [2024-07-20 16:16:47.979771] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:19.326 [2024-07-20 16:16:47.979910] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:19.326 [2024-07-20 16:16:48.031328] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:19.326 [2024-07-20 16:16:48.047611] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:08:19.326 INFO: Running with entropic power schedule (0xFF, 100). 00:08:19.326 INFO: Seed: 2012113867 00:08:19.327 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:08:19.327 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:08:19.327 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:19.327 INFO: A corpus is not provided, starting from an empty corpus 00:08:19.327 #2 INITED exec/s: 0 rss: 59Mb 00:08:19.327 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:19.327 This may also happen if the target rejected all inputs we tried so far 00:08:19.327 [2024-07-20 16:16:48.092303] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.327 [2024-07-20 16:16:48.092337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.327 [2024-07-20 16:16:48.092371] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.327 [2024-07-20 16:16:48.092388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.327 [2024-07-20 16:16:48.092416] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.327 [2024-07-20 16:16:48.092431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.841 NEW_FUNC[1/670]: 0x4aedc0 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:08:19.841 NEW_FUNC[2/670]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:19.841 #8 NEW cov: 11559 ft: 11560 corp: 2/62b lim: 100 exec/s: 0 rss: 66Mb L: 61/61 MS: 1 InsertRepeatedBytes- 00:08:19.841 [2024-07-20 16:16:48.423013] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.841 [2024-07-20 16:16:48.423053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.841 #9 NEW cov: 11672 ft: 12383 corp: 3/95b lim: 100 exec/s: 0 rss: 67Mb L: 33/61 MS: 1 InsertRepeatedBytes- 00:08:19.841 [2024-07-20 16:16:48.473023] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.841 [2024-07-20 16:16:48.473053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.841 [2024-07-20 16:16:48.473101] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.841 [2024-07-20 16:16:48.473117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.841 #10 NEW cov: 11678 ft: 12988 corp: 4/153b lim: 100 exec/s: 0 rss: 67Mb L: 58/61 MS: 1 CrossOver- 00:08:19.841 [2024-07-20 16:16:48.533143] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.841 [2024-07-20 16:16:48.533172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.841 #14 NEW cov: 11763 ft: 13282 corp: 5/173b lim: 100 exec/s: 0 rss: 67Mb L: 20/61 MS: 4 EraseBytes-CMP-EraseBytes-CopyPart- DE: "\014\000\000\000\000\000\000\000"- 00:08:19.841 [2024-07-20 16:16:48.583362] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.841 [2024-07-20 16:16:48.583392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.841 [2024-07-20 16:16:48.583439] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE 
ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.841 [2024-07-20 16:16:48.583463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.841 [2024-07-20 16:16:48.583492] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.841 [2024-07-20 16:16:48.583507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.841 #15 NEW cov: 11763 ft: 13454 corp: 6/244b lim: 100 exec/s: 0 rss: 67Mb L: 71/71 MS: 1 InsertRepeatedBytes- 00:08:19.841 [2024-07-20 16:16:48.643502] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.841 [2024-07-20 16:16:48.643533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.098 #16 NEW cov: 11763 ft: 13550 corp: 7/276b lim: 100 exec/s: 0 rss: 67Mb L: 32/71 MS: 1 EraseBytes- 00:08:20.098 [2024-07-20 16:16:48.693592] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.098 [2024-07-20 16:16:48.693620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.098 #22 NEW cov: 11763 ft: 13596 corp: 8/296b lim: 100 exec/s: 0 rss: 67Mb L: 20/71 MS: 1 ShuffleBytes- 00:08:20.098 [2024-07-20 16:16:48.753743] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.098 [2024-07-20 16:16:48.753772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.098 [2024-07-20 16:16:48.753819] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:20.098 [2024-07-20 16:16:48.753835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.098 #23 NEW cov: 11763 ft: 13607 corp: 9/337b lim: 100 exec/s: 0 rss: 68Mb L: 41/71 MS: 1 InsertRepeatedBytes- 00:08:20.098 [2024-07-20 16:16:48.813952] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.098 [2024-07-20 16:16:48.813983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.098 [2024-07-20 16:16:48.814031] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:20.098 [2024-07-20 16:16:48.814047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.098 #24 NEW cov: 11763 ft: 13644 corp: 10/378b lim: 100 exec/s: 0 rss: 68Mb L: 41/71 MS: 1 CrossOver- 00:08:20.098 [2024-07-20 16:16:48.874093] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.098 [2024-07-20 16:16:48.874120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.098 [2024-07-20 16:16:48.874168] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:20.098 [2024-07-20 16:16:48.874183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.355 #25 NEW cov: 11763 ft: 13685 corp: 11/436b lim: 100 exec/s: 0 rss: 68Mb L: 58/71 MS: 1 ChangeBit- 00:08:20.355 [2024-07-20 16:16:48.934263] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.355 [2024-07-20 16:16:48.934290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.355 [2024-07-20 16:16:48.934336] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:20.355 [2024-07-20 16:16:48.934351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.355 [2024-07-20 16:16:48.934380] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:20.355 [2024-07-20 16:16:48.934395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.355 #26 NEW cov: 11763 ft: 13760 corp: 12/497b lim: 100 exec/s: 0 rss: 68Mb L: 61/71 MS: 1 ChangeByte- 00:08:20.355 [2024-07-20 16:16:48.984345] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.355 [2024-07-20 16:16:48.984373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.355 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:20.355 #27 NEW cov: 11780 ft: 13808 corp: 13/529b lim: 100 exec/s: 0 rss: 68Mb L: 32/71 MS: 1 CMP- DE: "\377-\362r\353\236\204\246"- 00:08:20.355 [2024-07-20 16:16:49.034571] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.355 [2024-07-20 16:16:49.034598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.355 [2024-07-20 16:16:49.034644] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:20.355 [2024-07-20 16:16:49.034660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.355 [2024-07-20 16:16:49.034689] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:20.355 [2024-07-20 16:16:49.034705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.355 [2024-07-20 16:16:49.034733] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:20.355 [2024-07-20 16:16:49.034747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:20.355 #28 NEW cov: 11780 ft: 14095 corp: 14/611b lim: 100 exec/s: 28 rss: 68Mb L: 82/82 MS: 1 CrossOver- 00:08:20.355 [2024-07-20 16:16:49.094694] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.355 [2024-07-20 16:16:49.094725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.355 [2024-07-20 16:16:49.094773] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:20.355 [2024-07-20 16:16:49.094789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.355 #30 NEW cov: 11780 ft: 14128 corp: 15/661b lim: 100 exec/s: 30 rss: 68Mb L: 50/82 MS: 2 PersAutoDict-CrossOver- DE: "\377-\362r\353\236\204\246"- 00:08:20.355 [2024-07-20 16:16:49.144783] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.355 [2024-07-20 16:16:49.144813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.355 [2024-07-20 16:16:49.144860] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:20.355 [2024-07-20 16:16:49.144876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.613 #36 NEW cov: 11780 ft: 14150 corp: 16/702b lim: 100 exec/s: 36 rss: 68Mb L: 41/82 MS: 1 ChangeBit- 00:08:20.613 [2024-07-20 16:16:49.194944] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.613 [2024-07-20 16:16:49.194972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.613 [2024-07-20 16:16:49.195019] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:20.613 [2024-07-20 16:16:49.195035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.613 #38 NEW cov: 11780 ft: 14172 corp: 17/761b lim: 100 exec/s: 38 rss: 68Mb L: 59/82 MS: 2 ChangeBit-InsertRepeatedBytes- 00:08:20.613 [2024-07-20 16:16:49.245085] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.613 [2024-07-20 16:16:49.245113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.613 [2024-07-20 16:16:49.245159] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:20.613 [2024-07-20 16:16:49.245175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.613 [2024-07-20 16:16:49.245203] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:20.613 [2024-07-20 16:16:49.245218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.613 #39 NEW cov: 11780 ft: 14186 corp: 18/822b lim: 100 exec/s: 39 rss: 68Mb L: 61/82 MS: 1 CopyPart- 00:08:20.613 [2024-07-20 16:16:49.295175] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.613 [2024-07-20 16:16:49.295203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.613 [2024-07-20 16:16:49.295250] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:20.613 [2024-07-20 16:16:49.295266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.613 #40 NEW cov: 11780 ft: 14232 corp: 19/872b lim: 100 exec/s: 40 rss: 68Mb L: 50/82 MS: 1 ChangeBinInt- 00:08:20.613 [2024-07-20 16:16:49.355342] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.613 [2024-07-20 16:16:49.355371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.613 [2024-07-20 16:16:49.355419] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:20.613 [2024-07-20 16:16:49.355439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.613 #41 NEW cov: 11780 ft: 14265 corp: 20/922b lim: 100 exec/s: 41 rss: 68Mb L: 50/82 MS: 1 ChangeBinInt- 00:08:20.613 [2024-07-20 16:16:49.415543] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.613 [2024-07-20 16:16:49.415573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.613 [2024-07-20 16:16:49.415607] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:20.613 [2024-07-20 16:16:49.415623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.870 #42 NEW cov: 11780 ft: 14287 corp: 21/962b lim: 100 exec/s: 42 rss: 69Mb L: 40/82 MS: 1 EraseBytes- 00:08:20.870 [2024-07-20 16:16:49.475720] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.870 [2024-07-20 16:16:49.475749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.870 [2024-07-20 16:16:49.475796] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:20.870 [2024-07-20 16:16:49.475811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.870 [2024-07-20 16:16:49.475839] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:20.870 [2024-07-20 16:16:49.475854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.870 #43 NEW cov: 11780 ft: 14310 corp: 22/1039b lim: 100 exec/s: 43 rss: 69Mb L: 77/82 MS: 1 CopyPart- 00:08:20.870 [2024-07-20 16:16:49.525829] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.870 [2024-07-20 16:16:49.525857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.870 [2024-07-20 16:16:49.525903] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:20.870 [2024-07-20 16:16:49.525919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.870 [2024-07-20 16:16:49.525948] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:20.870 [2024-07-20 16:16:49.525962] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.870 #44 NEW cov: 11780 ft: 14341 corp: 23/1103b lim: 100 exec/s: 44 rss: 69Mb L: 64/82 MS: 1 CrossOver- 00:08:20.870 [2024-07-20 16:16:49.585943] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.870 [2024-07-20 16:16:49.585970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.870 #48 NEW cov: 11780 ft: 14384 corp: 24/1141b lim: 100 exec/s: 48 rss: 69Mb L: 38/82 MS: 4 ChangeByte-ShuffleBytes-InsertByte-InsertRepeatedBytes- 00:08:20.870 [2024-07-20 16:16:49.636075] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.870 [2024-07-20 16:16:49.636103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.870 #49 NEW cov: 11780 ft: 14393 corp: 25/1175b lim: 100 exec/s: 49 rss: 69Mb L: 34/82 MS: 1 InsertByte- 00:08:21.128 [2024-07-20 16:16:49.686212] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:21.128 [2024-07-20 16:16:49.686240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.128 [2024-07-20 16:16:49.686287] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:21.128 [2024-07-20 16:16:49.686306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.128 #50 NEW cov: 11780 ft: 14407 corp: 26/1234b lim: 100 exec/s: 50 rss: 69Mb L: 59/82 MS: 1 CopyPart- 00:08:21.128 [2024-07-20 16:16:49.746364] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:21.128 [2024-07-20 16:16:49.746391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.128 [2024-07-20 16:16:49.746438] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:21.128 [2024-07-20 16:16:49.746461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.128 #51 NEW cov: 11780 ft: 14441 corp: 27/1275b lim: 100 exec/s: 51 rss: 69Mb L: 41/82 MS: 1 PersAutoDict- DE: "\014\000\000\000\000\000\000\000"- 00:08:21.128 [2024-07-20 16:16:49.806570] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:21.128 [2024-07-20 16:16:49.806599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.128 [2024-07-20 16:16:49.806646] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:21.128 [2024-07-20 16:16:49.806662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.128 #52 NEW cov: 11780 ft: 14446 corp: 28/1334b lim: 100 exec/s: 52 rss: 69Mb L: 59/82 MS: 1 PersAutoDict- DE: "\014\000\000\000\000\000\000\000"- 00:08:21.128 [2024-07-20 16:16:49.856785] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 
00:08:21.128 [2024-07-20 16:16:49.856814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.128 [2024-07-20 16:16:49.856861] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:21.128 [2024-07-20 16:16:49.856877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.128 [2024-07-20 16:16:49.856905] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:21.129 [2024-07-20 16:16:49.856920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.129 [2024-07-20 16:16:49.856947] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:21.129 [2024-07-20 16:16:49.856961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.129 #53 NEW cov: 11780 ft: 14464 corp: 29/1416b lim: 100 exec/s: 53 rss: 69Mb L: 82/82 MS: 1 InsertRepeatedBytes- 00:08:21.129 [2024-07-20 16:16:49.916819] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:21.129 [2024-07-20 16:16:49.916847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.387 #54 NEW cov: 11780 ft: 14470 corp: 30/1449b lim: 100 exec/s: 54 rss: 69Mb L: 33/82 MS: 1 PersAutoDict- DE: "\014\000\000\000\000\000\000\000"- 00:08:21.387 [2024-07-20 16:16:49.966987] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:21.387 [2024-07-20 16:16:49.967016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.387 [2024-07-20 16:16:49.967063] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:21.387 [2024-07-20 16:16:49.967079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.387 #55 NEW cov: 11787 ft: 14510 corp: 31/1490b lim: 100 exec/s: 55 rss: 69Mb L: 41/82 MS: 1 ChangeBinInt- 00:08:21.387 [2024-07-20 16:16:50.017158] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:21.387 [2024-07-20 16:16:50.017189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.387 [2024-07-20 16:16:50.017223] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:21.387 [2024-07-20 16:16:50.017239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.387 #56 NEW cov: 11787 ft: 14561 corp: 32/1540b lim: 100 exec/s: 56 rss: 69Mb L: 50/82 MS: 1 ChangeBinInt- 00:08:21.387 [2024-07-20 16:16:50.067258] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:21.387 [2024-07-20 16:16:50.067289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.387 #61 NEW cov: 11787 ft: 
14569 corp: 33/1564b lim: 100 exec/s: 30 rss: 69Mb L: 24/82 MS: 5 EraseBytes-ChangeByte-ChangeBinInt-ChangeByte-CrossOver- 00:08:21.387 #61 DONE cov: 11787 ft: 14569 corp: 33/1564b lim: 100 exec/s: 30 rss: 69Mb 00:08:21.388 ###### Recommended dictionary. ###### 00:08:21.388 "\014\000\000\000\000\000\000\000" # Uses: 4 00:08:21.388 "\377-\362r\353\236\204\246" # Uses: 1 00:08:21.388 ###### End of recommended dictionary. ###### 00:08:21.388 Done 61 runs in 2 second(s) 00:08:21.646 16:16:50 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_18.conf 00:08:21.647 16:16:50 -- ../common.sh@72 -- # (( i++ )) 00:08:21.647 16:16:50 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:21.647 16:16:50 -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1 00:08:21.647 16:16:50 -- nvmf/run.sh@23 -- # local fuzzer_type=19 00:08:21.647 16:16:50 -- nvmf/run.sh@24 -- # local timen=1 00:08:21.647 16:16:50 -- nvmf/run.sh@25 -- # local core=0x1 00:08:21.647 16:16:50 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:21.647 16:16:50 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf 00:08:21.647 16:16:50 -- nvmf/run.sh@29 -- # printf %02d 19 00:08:21.647 16:16:50 -- nvmf/run.sh@29 -- # port=4419 00:08:21.647 16:16:50 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:21.647 16:16:50 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' 00:08:21.647 16:16:50 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:21.647 16:16:50 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 -r /var/tmp/spdk19.sock 00:08:21.647 [2024-07-20 16:16:50.268869] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:08:21.647 [2024-07-20 16:16:50.268963] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2277944 ] 00:08:21.647 EAL: No free 2048 kB hugepages reported on node 1 00:08:21.647 [2024-07-20 16:16:50.444598] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:21.907 [2024-07-20 16:16:50.464565] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:21.907 [2024-07-20 16:16:50.464684] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:21.907 [2024-07-20 16:16:50.516037] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:21.907 [2024-07-20 16:16:50.532357] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 *** 00:08:21.907 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:21.907 INFO: Seed: 201150548 00:08:21.907 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:08:21.907 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:08:21.907 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:21.907 INFO: A corpus is not provided, starting from an empty corpus 00:08:21.907 #2 INITED exec/s: 0 rss: 59Mb 00:08:21.907 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:21.907 This may also happen if the target rejected all inputs we tried so far 00:08:21.907 [2024-07-20 16:16:50.576985] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 00:08:21.907 [2024-07-20 16:16:50.577021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.907 [2024-07-20 16:16:50.577071] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:21.907 [2024-07-20 16:16:50.577090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.907 [2024-07-20 16:16:50.577118] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:21.907 [2024-07-20 16:16:50.577135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.166 NEW_FUNC[1/670]: 0x4b1d80 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:08:22.166 NEW_FUNC[2/670]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:22.166 #4 NEW cov: 11537 ft: 11538 corp: 2/32b lim: 50 exec/s: 0 rss: 66Mb L: 31/31 MS: 2 CrossOver-InsertRepeatedBytes- 00:08:22.166 [2024-07-20 16:16:50.897814] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446742978677440511 len:12019 00:08:22.166 [2024-07-20 16:16:50.897855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.166 [2024-07-20 16:16:50.897891] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12465963766218815792 len:65536 00:08:22.166 [2024-07-20 16:16:50.897909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.166 [2024-07-20 16:16:50.897938] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:22.166 [2024-07-20 16:16:50.897955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.167 #10 NEW cov: 11650 ft: 12088 corp: 3/71b lim: 50 exec/s: 0 rss: 66Mb L: 39/39 MS: 1 CMP- DE: "\001.\362t\\\3710\254"- 00:08:22.167 [2024-07-20 16:16:50.957836] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 00:08:22.167 [2024-07-20 16:16:50.957869] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.167 [2024-07-20 16:16:50.957916] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:22.167 [2024-07-20 16:16:50.957933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.167 [2024-07-20 16:16:50.957962] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18377782704415440895 len:65536 00:08:22.167 [2024-07-20 16:16:50.957979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.167 [2024-07-20 16:16:50.958006] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:22.167 [2024-07-20 16:16:50.958022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.426 #16 NEW cov: 11656 ft: 12616 corp: 4/112b lim: 50 exec/s: 0 rss: 66Mb L: 41/41 MS: 1 CrossOver- 00:08:22.426 [2024-07-20 16:16:51.007891] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 00:08:22.426 [2024-07-20 16:16:51.007921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.426 [2024-07-20 16:16:51.007968] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:22.426 [2024-07-20 16:16:51.007985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.426 [2024-07-20 16:16:51.008014] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:22.426 [2024-07-20 16:16:51.008031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.426 #18 NEW cov: 11741 ft: 12866 corp: 5/144b lim: 50 exec/s: 0 rss: 66Mb L: 32/41 MS: 2 ShuffleBytes-CrossOver- 00:08:22.426 [2024-07-20 16:16:51.058039] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 00:08:22.426 [2024-07-20 16:16:51.058068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.426 [2024-07-20 16:16:51.058114] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:1024 00:08:22.426 [2024-07-20 16:16:51.058131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.426 [2024-07-20 16:16:51.058160] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:22.426 [2024-07-20 16:16:51.058177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.426 #19 NEW cov: 
11741 ft: 13023 corp: 6/176b lim: 50 exec/s: 0 rss: 66Mb L: 32/41 MS: 1 ChangeBinInt- 00:08:22.426 [2024-07-20 16:16:51.118228] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446742978677440511 len:12019 00:08:22.426 [2024-07-20 16:16:51.118257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.426 [2024-07-20 16:16:51.118304] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18279266460223994160 len:65536 00:08:22.426 [2024-07-20 16:16:51.118321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.426 [2024-07-20 16:16:51.118350] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:22.426 [2024-07-20 16:16:51.118366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.426 [2024-07-20 16:16:51.118394] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65291 00:08:22.426 [2024-07-20 16:16:51.118410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.426 #20 NEW cov: 11741 ft: 13116 corp: 7/216b lim: 50 exec/s: 0 rss: 66Mb L: 40/41 MS: 1 InsertByte- 00:08:22.426 [2024-07-20 16:16:51.178408] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 00:08:22.426 [2024-07-20 16:16:51.178438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.426 [2024-07-20 16:16:51.178480] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:2816 00:08:22.426 [2024-07-20 16:16:51.178498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.426 [2024-07-20 16:16:51.178527] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:22.426 [2024-07-20 16:16:51.178543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.426 #21 NEW cov: 11741 ft: 13181 corp: 8/248b lim: 50 exec/s: 0 rss: 67Mb L: 32/41 MS: 1 CrossOver- 00:08:22.686 [2024-07-20 16:16:51.238564] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 00:08:22.686 [2024-07-20 16:16:51.238595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.686 [2024-07-20 16:16:51.238627] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:22.686 [2024-07-20 16:16:51.238645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.686 [2024-07-20 16:16:51.238674] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:4251398048237748223 len:65536 00:08:22.686 [2024-07-20 16:16:51.238691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.686 #22 NEW cov: 11741 ft: 13251 corp: 9/280b lim: 50 exec/s: 0 rss: 67Mb L: 32/41 MS: 1 ChangeByte- 00:08:22.686 [2024-07-20 16:16:51.288659] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446742978677440511 len:12019 00:08:22.686 [2024-07-20 16:16:51.288688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.686 [2024-07-20 16:16:51.288735] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12465963765145073968 len:65536 00:08:22.686 [2024-07-20 16:16:51.288752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.686 [2024-07-20 16:16:51.288781] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:22.686 [2024-07-20 16:16:51.288797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.686 #23 NEW cov: 11741 ft: 13295 corp: 10/319b lim: 50 exec/s: 0 rss: 67Mb L: 39/41 MS: 1 ChangeBit- 00:08:22.686 [2024-07-20 16:16:51.338764] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 00:08:22.686 [2024-07-20 16:16:51.338795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.686 [2024-07-20 16:16:51.338842] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:22.686 [2024-07-20 16:16:51.338860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.686 [2024-07-20 16:16:51.338889] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073692971263 len:65536 00:08:22.686 [2024-07-20 16:16:51.338905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.686 #24 NEW cov: 11741 ft: 13339 corp: 11/351b lim: 50 exec/s: 0 rss: 67Mb L: 32/41 MS: 1 ChangeBinInt- 00:08:22.686 [2024-07-20 16:16:51.388989] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446742978677440511 len:65536 00:08:22.686 [2024-07-20 16:16:51.389025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.686 [2024-07-20 16:16:51.389059] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446514219944771583 len:13405 00:08:22.686 [2024-07-20 16:16:51.389077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.686 [2024-07-20 16:16:51.389105] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 
lba:18446744073595301119 len:65536 00:08:22.686 [2024-07-20 16:16:51.389122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.686 [2024-07-20 16:16:51.389149] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:22.686 [2024-07-20 16:16:51.389166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.686 #25 NEW cov: 11741 ft: 13383 corp: 12/398b lim: 50 exec/s: 0 rss: 67Mb L: 47/47 MS: 1 InsertRepeatedBytes- 00:08:22.686 [2024-07-20 16:16:51.449070] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 00:08:22.686 [2024-07-20 16:16:51.449100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.686 [2024-07-20 16:16:51.449147] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709533695 len:1024 00:08:22.686 [2024-07-20 16:16:51.449164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.686 [2024-07-20 16:16:51.449193] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:22.686 [2024-07-20 16:16:51.449209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.686 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:22.686 #26 NEW cov: 11758 ft: 13478 corp: 13/430b lim: 50 exec/s: 0 rss: 67Mb L: 32/47 MS: 1 ChangeByte- 00:08:22.945 [2024-07-20 16:16:51.499275] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 00:08:22.945 [2024-07-20 16:16:51.499306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.945 [2024-07-20 16:16:51.499338] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:22.946 [2024-07-20 16:16:51.499355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.946 [2024-07-20 16:16:51.499383] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:4251398048237748223 len:65282 00:08:22.946 [2024-07-20 16:16:51.499400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.946 [2024-07-20 16:16:51.499429] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:17956041926330446940 len:65291 00:08:22.946 [2024-07-20 16:16:51.499452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.946 #27 NEW cov: 11758 ft: 13509 corp: 14/470b lim: 50 exec/s: 0 rss: 67Mb L: 40/47 MS: 1 PersAutoDict- DE: "\001.\362t\\\3710\254"- 00:08:22.946 [2024-07-20 
16:16:51.559388] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:864690033423024127 len:12019 00:08:22.946 [2024-07-20 16:16:51.559423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.946 [2024-07-20 16:16:51.559463] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18279266460223994160 len:65536 00:08:22.946 [2024-07-20 16:16:51.559481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.946 [2024-07-20 16:16:51.559510] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:22.946 [2024-07-20 16:16:51.559526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.946 [2024-07-20 16:16:51.559553] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65291 00:08:22.946 [2024-07-20 16:16:51.559569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.946 #28 NEW cov: 11758 ft: 13557 corp: 15/510b lim: 50 exec/s: 28 rss: 67Mb L: 40/47 MS: 1 ChangeBinInt- 00:08:22.946 [2024-07-20 16:16:51.619493] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 00:08:22.946 [2024-07-20 16:16:51.619523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.946 [2024-07-20 16:16:51.619570] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446466996779334143 len:65536 00:08:22.946 [2024-07-20 16:16:51.619589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.946 [2024-07-20 16:16:51.619618] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:22.946 [2024-07-20 16:16:51.619635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.946 #29 NEW cov: 11758 ft: 13581 corp: 16/542b lim: 50 exec/s: 29 rss: 67Mb L: 32/47 MS: 1 CopyPart- 00:08:22.946 [2024-07-20 16:16:51.679606] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 00:08:22.946 [2024-07-20 16:16:51.679636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.946 [2024-07-20 16:16:51.679683] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:22.946 [2024-07-20 16:16:51.679701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.946 #30 NEW cov: 11758 ft: 13892 corp: 17/567b lim: 50 exec/s: 30 rss: 67Mb L: 25/47 MS: 1 EraseBytes- 00:08:22.946 [2024-07-20 16:16:51.739837] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 00:08:22.946 [2024-07-20 16:16:51.739868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.946 [2024-07-20 16:16:51.739915] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:9 00:08:22.946 [2024-07-20 16:16:51.739932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.946 [2024-07-20 16:16:51.739961] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:22.946 [2024-07-20 16:16:51.739977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.206 #31 NEW cov: 11758 ft: 13911 corp: 18/599b lim: 50 exec/s: 31 rss: 68Mb L: 32/47 MS: 1 CMP- DE: "\000\010"- 00:08:23.206 [2024-07-20 16:16:51.800080] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446742978677440511 len:65536 00:08:23.206 [2024-07-20 16:16:51.800112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.206 [2024-07-20 16:16:51.800144] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446514219944771583 len:13405 00:08:23.206 [2024-07-20 16:16:51.800161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.206 [2024-07-20 16:16:51.800190] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073595301119 len:65380 00:08:23.206 [2024-07-20 16:16:51.800206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.206 [2024-07-20 16:16:51.800234] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:23.206 [2024-07-20 16:16:51.800249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.206 #32 NEW cov: 11758 ft: 14010 corp: 19/647b lim: 50 exec/s: 32 rss: 68Mb L: 48/48 MS: 1 InsertByte- 00:08:23.206 [2024-07-20 16:16:51.860193] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446742978677440511 len:12019 00:08:23.206 [2024-07-20 16:16:51.860223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.206 [2024-07-20 16:16:51.860269] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18279266460223994160 len:65536 00:08:23.206 [2024-07-20 16:16:51.860286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.206 [2024-07-20 16:16:51.860315] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:23.206 [2024-07-20 16:16:51.860331] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.206 [2024-07-20 16:16:51.860358] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65307 00:08:23.206 [2024-07-20 16:16:51.860374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.206 #33 NEW cov: 11758 ft: 14077 corp: 20/687b lim: 50 exec/s: 33 rss: 68Mb L: 40/48 MS: 1 ChangeBit- 00:08:23.206 [2024-07-20 16:16:51.910330] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446742978677440511 len:12019 00:08:23.206 [2024-07-20 16:16:51.910360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.206 [2024-07-20 16:16:51.910404] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12465963766218815792 len:65536 00:08:23.206 [2024-07-20 16:16:51.910421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.206 [2024-07-20 16:16:51.910457] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:23.206 [2024-07-20 16:16:51.910474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.206 [2024-07-20 16:16:51.910501] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:23.206 [2024-07-20 16:16:51.910517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.206 #34 NEW cov: 11758 ft: 14097 corp: 21/735b lim: 50 exec/s: 34 rss: 68Mb L: 48/48 MS: 1 CopyPart- 00:08:23.206 [2024-07-20 16:16:51.960466] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446742978677440511 len:65536 00:08:23.206 [2024-07-20 16:16:51.960496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.206 [2024-07-20 16:16:51.960542] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446514219944771583 len:13405 00:08:23.206 [2024-07-20 16:16:51.960559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.206 [2024-07-20 16:16:51.960588] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073595301119 len:65536 00:08:23.206 [2024-07-20 16:16:51.960604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.206 [2024-07-20 16:16:51.960631] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:23.206 [2024-07-20 16:16:51.960647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.206 #35 NEW cov: 11758 ft: 
14116 corp: 22/782b lim: 50 exec/s: 35 rss: 68Mb L: 47/48 MS: 1 ShuffleBytes- 00:08:23.466 [2024-07-20 16:16:52.010568] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 00:08:23.466 [2024-07-20 16:16:52.010599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.466 [2024-07-20 16:16:52.010633] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073693432575 len:65466 00:08:23.466 [2024-07-20 16:16:52.010651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.466 #36 NEW cov: 11758 ft: 14130 corp: 23/802b lim: 50 exec/s: 36 rss: 68Mb L: 20/48 MS: 1 CrossOver- 00:08:23.466 [2024-07-20 16:16:52.070719] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:23.466 [2024-07-20 16:16:52.070748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.466 [2024-07-20 16:16:52.070794] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446463900095021055 len:62069 00:08:23.466 [2024-07-20 16:16:52.070811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.466 [2024-07-20 16:16:52.070839] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744070974419116 len:65536 00:08:23.466 [2024-07-20 16:16:52.070856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.466 #37 NEW cov: 11758 ft: 14151 corp: 24/834b lim: 50 exec/s: 37 rss: 68Mb L: 32/48 MS: 1 CrossOver- 00:08:23.466 [2024-07-20 16:16:52.120774] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 00:08:23.466 [2024-07-20 16:16:52.120803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.466 [2024-07-20 16:16:52.120851] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:23.466 [2024-07-20 16:16:52.120868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.466 #38 NEW cov: 11758 ft: 14232 corp: 25/859b lim: 50 exec/s: 38 rss: 68Mb L: 25/48 MS: 1 ChangeBit- 00:08:23.466 [2024-07-20 16:16:52.181043] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 00:08:23.466 [2024-07-20 16:16:52.181073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.466 [2024-07-20 16:16:52.181118] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:23.466 [2024-07-20 16:16:52.181135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 
sqhd:0003 p:0 m:0 dnr:1 00:08:23.466 [2024-07-20 16:16:52.181163] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:9 00:08:23.466 [2024-07-20 16:16:52.181179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.466 [2024-07-20 16:16:52.181206] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:23.466 [2024-07-20 16:16:52.181222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.466 #39 NEW cov: 11758 ft: 14247 corp: 26/901b lim: 50 exec/s: 39 rss: 68Mb L: 42/48 MS: 1 CopyPart- 00:08:23.466 [2024-07-20 16:16:52.241195] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446742978677440511 len:12019 00:08:23.466 [2024-07-20 16:16:52.241225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.466 [2024-07-20 16:16:52.241255] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12465963765145073968 len:65536 00:08:23.466 [2024-07-20 16:16:52.241287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.466 [2024-07-20 16:16:52.241316] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18389098001915183103 len:13108 00:08:23.466 [2024-07-20 16:16:52.241332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.466 [2024-07-20 16:16:52.241359] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:23.466 [2024-07-20 16:16:52.241375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.725 #40 NEW cov: 11758 ft: 14286 corp: 27/945b lim: 50 exec/s: 40 rss: 68Mb L: 44/48 MS: 1 InsertRepeatedBytes- 00:08:23.725 [2024-07-20 16:16:52.291288] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446742978677440511 len:12019 00:08:23.725 [2024-07-20 16:16:52.291317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.725 [2024-07-20 16:16:52.291362] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12465963765145073968 len:65536 00:08:23.725 [2024-07-20 16:16:52.291379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.725 [2024-07-20 16:16:52.291408] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:23.725 [2024-07-20 16:16:52.291424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.725 #41 NEW cov: 11758 ft: 14293 corp: 28/984b lim: 50 exec/s: 41 rss: 68Mb L: 39/48 MS: 1 CopyPart- 00:08:23.725 [2024-07-20 
16:16:52.341339] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 00:08:23.725 [2024-07-20 16:16:52.341373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.725 [2024-07-20 16:16:52.341420] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073697624063 len:65536 00:08:23.725 [2024-07-20 16:16:52.341437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.725 #47 NEW cov: 11758 ft: 14308 corp: 29/1010b lim: 50 exec/s: 47 rss: 68Mb L: 26/48 MS: 1 InsertByte- 00:08:23.726 [2024-07-20 16:16:52.401651] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446742978677440511 len:12019 00:08:23.726 [2024-07-20 16:16:52.401680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.726 [2024-07-20 16:16:52.401724] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12465682329896810800 len:65536 00:08:23.726 [2024-07-20 16:16:52.401741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.726 [2024-07-20 16:16:52.401770] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:23.726 [2024-07-20 16:16:52.401787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.726 [2024-07-20 16:16:52.401813] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:23.726 [2024-07-20 16:16:52.401829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.726 #48 NEW cov: 11758 ft: 14325 corp: 30/1051b lim: 50 exec/s: 48 rss: 68Mb L: 41/48 MS: 1 PersAutoDict- DE: "\000\010"- 00:08:23.726 [2024-07-20 16:16:52.451738] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 00:08:23.726 [2024-07-20 16:16:52.451766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.726 [2024-07-20 16:16:52.451812] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:11008 00:08:23.726 [2024-07-20 16:16:52.451829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.726 [2024-07-20 16:16:52.451858] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:4251398048237748223 len:65282 00:08:23.726 [2024-07-20 16:16:52.451874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.726 [2024-07-20 16:16:52.451901] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:17956041926330446940 len:65291 
00:08:23.726 [2024-07-20 16:16:52.451917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.726 #49 NEW cov: 11765 ft: 14342 corp: 31/1091b lim: 50 exec/s: 49 rss: 68Mb L: 40/48 MS: 1 ChangeByte- 00:08:23.726 [2024-07-20 16:16:52.511897] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:2816 00:08:23.726 [2024-07-20 16:16:52.511926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.726 [2024-07-20 16:16:52.511971] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:23.726 [2024-07-20 16:16:52.511988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.726 [2024-07-20 16:16:52.512021] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:23.726 [2024-07-20 16:16:52.512037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.726 [2024-07-20 16:16:52.512064] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18374696375276273663 len:65536 00:08:23.726 [2024-07-20 16:16:52.512080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.985 #50 NEW cov: 11765 ft: 14347 corp: 32/1140b lim: 50 exec/s: 50 rss: 69Mb L: 49/49 MS: 1 CopyPart- 00:08:23.985 [2024-07-20 16:16:52.572053] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446742978677440511 len:11987 00:08:23.985 [2024-07-20 16:16:52.572083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.985 [2024-07-20 16:16:52.572128] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12465963766218815792 len:65536 00:08:23.985 [2024-07-20 16:16:52.572145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.985 [2024-07-20 16:16:52.572174] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:23.985 [2024-07-20 16:16:52.572190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.985 [2024-07-20 16:16:52.572217] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:23.985 [2024-07-20 16:16:52.572233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.985 #51 NEW cov: 11765 ft: 14368 corp: 33/1188b lim: 50 exec/s: 25 rss: 69Mb L: 48/49 MS: 1 ChangeBit- 00:08:23.985 #51 DONE cov: 11765 ft: 14368 corp: 33/1188b lim: 50 exec/s: 25 rss: 69Mb 00:08:23.985 ###### Recommended dictionary. 
###### 00:08:23.985 "\001.\362t\\\3710\254" # Uses: 1 00:08:23.985 "\000\010" # Uses: 1 00:08:23.985 ###### End of recommended dictionary. ###### 00:08:23.985 Done 51 runs in 2 second(s) 00:08:23.985 16:16:52 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_19.conf 00:08:23.985 16:16:52 -- ../common.sh@72 -- # (( i++ )) 00:08:23.985 16:16:52 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:23.985 16:16:52 -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:08:23.985 16:16:52 -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:08:23.985 16:16:52 -- nvmf/run.sh@24 -- # local timen=1 00:08:23.985 16:16:52 -- nvmf/run.sh@25 -- # local core=0x1 00:08:23.985 16:16:52 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:23.985 16:16:52 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:08:23.985 16:16:52 -- nvmf/run.sh@29 -- # printf %02d 20 00:08:23.985 16:16:52 -- nvmf/run.sh@29 -- # port=4420 00:08:23.985 16:16:52 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:23.985 16:16:52 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:08:23.985 16:16:52 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:23.985 16:16:52 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 -r /var/tmp/spdk20.sock 00:08:23.985 [2024-07-20 16:16:52.767500] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:08:23.985 [2024-07-20 16:16:52.767571] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2278267 ] 00:08:24.246 EAL: No free 2048 kB hugepages reported on node 1 00:08:24.246 [2024-07-20 16:16:52.943406] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:24.246 [2024-07-20 16:16:52.963207] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:24.246 [2024-07-20 16:16:52.963346] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:24.246 [2024-07-20 16:16:53.014807] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:24.246 [2024-07-20 16:16:53.031127] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:08:24.246 INFO: Running with entropic power schedule (0xFF, 100). 00:08:24.246 INFO: Seed: 2698169929 00:08:24.506 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:08:24.506 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:08:24.506 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:24.506 INFO: A corpus is not provided, starting from an empty corpus 00:08:24.506 #2 INITED exec/s: 0 rss: 60Mb 00:08:24.506 WARNING: no interesting inputs were found so far. 
Is the code instrumented for coverage? 00:08:24.506 This may also happen if the target rejected all inputs we tried so far 00:08:24.506 [2024-07-20 16:16:53.100471] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.506 [2024-07-20 16:16:53.100511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.764 NEW_FUNC[1/672]: 0x4b3940 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:08:24.764 NEW_FUNC[2/672]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:24.764 #7 NEW cov: 11592 ft: 11596 corp: 2/19b lim: 90 exec/s: 0 rss: 66Mb L: 18/18 MS: 5 InsertByte-ChangeBinInt-ChangeBit-InsertRepeatedBytes-InsertByte- 00:08:24.764 [2024-07-20 16:16:53.442092] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.764 [2024-07-20 16:16:53.442140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.764 [2024-07-20 16:16:53.442274] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.764 [2024-07-20 16:16:53.442304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.764 [2024-07-20 16:16:53.442434] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.764 [2024-07-20 16:16:53.442471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.764 [2024-07-20 16:16:53.442603] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:24.764 [2024-07-20 16:16:53.442634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.764 #11 NEW cov: 11708 ft: 13030 corp: 3/92b lim: 90 exec/s: 0 rss: 66Mb L: 73/73 MS: 4 CMP-ChangeBit-CMP-InsertRepeatedBytes- DE: "\001\000\000\000\000\000\000\000"-"\015\000\000\000"- 00:08:24.764 [2024-07-20 16:16:53.482086] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.764 [2024-07-20 16:16:53.482124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.764 [2024-07-20 16:16:53.482208] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.764 [2024-07-20 16:16:53.482231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.764 [2024-07-20 16:16:53.482357] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.764 [2024-07-20 16:16:53.482379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.764 [2024-07-20 16:16:53.482496] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:24.764 [2024-07-20 
16:16:53.482518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.764 #12 NEW cov: 11714 ft: 13325 corp: 4/165b lim: 90 exec/s: 0 rss: 66Mb L: 73/73 MS: 1 CopyPart- 00:08:24.764 [2024-07-20 16:16:53.531616] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.764 [2024-07-20 16:16:53.531643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.764 #17 NEW cov: 11799 ft: 13657 corp: 5/183b lim: 90 exec/s: 0 rss: 67Mb L: 18/73 MS: 5 EraseBytes-ChangeByte-PersAutoDict-ShuffleBytes-InsertRepeatedBytes- DE: "\015\000\000\000"- 00:08:25.022 [2024-07-20 16:16:53.582455] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.022 [2024-07-20 16:16:53.582503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.022 [2024-07-20 16:16:53.582594] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:25.022 [2024-07-20 16:16:53.582618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.022 [2024-07-20 16:16:53.582736] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:25.022 [2024-07-20 16:16:53.582759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.022 [2024-07-20 16:16:53.582890] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:25.022 [2024-07-20 16:16:53.582913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.022 #18 NEW cov: 11799 ft: 13746 corp: 6/256b lim: 90 exec/s: 0 rss: 67Mb L: 73/73 MS: 1 ChangeByte- 00:08:25.022 [2024-07-20 16:16:53.621914] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.022 [2024-07-20 16:16:53.621942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.022 #24 NEW cov: 11799 ft: 13840 corp: 7/291b lim: 90 exec/s: 0 rss: 67Mb L: 35/73 MS: 1 CopyPart- 00:08:25.022 [2024-07-20 16:16:53.662004] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.022 [2024-07-20 16:16:53.662030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.022 #25 NEW cov: 11799 ft: 13884 corp: 8/326b lim: 90 exec/s: 0 rss: 67Mb L: 35/73 MS: 1 CMP- DE: "\000\002\000\000"- 00:08:25.022 [2024-07-20 16:16:53.702086] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.022 [2024-07-20 16:16:53.702113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.022 #26 NEW cov: 11799 ft: 13923 corp: 9/348b lim: 90 exec/s: 0 rss: 67Mb L: 22/73 MS: 1 PersAutoDict- DE: "\015\000\000\000"- 00:08:25.022 [2024-07-20 16:16:53.742250] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.022 [2024-07-20 16:16:53.742277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.022 #32 NEW cov: 11799 ft: 13934 corp: 10/366b lim: 90 exec/s: 0 rss: 67Mb L: 18/73 MS: 1 ChangeByte- 00:08:25.022 [2024-07-20 16:16:53.782314] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.022 [2024-07-20 16:16:53.782344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.022 #33 NEW cov: 11799 ft: 13962 corp: 11/401b lim: 90 exec/s: 0 rss: 67Mb L: 35/73 MS: 1 CopyPart- 00:08:25.022 [2024-07-20 16:16:53.822537] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.022 [2024-07-20 16:16:53.822564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.280 #34 NEW cov: 11799 ft: 14029 corp: 12/423b lim: 90 exec/s: 0 rss: 67Mb L: 22/73 MS: 1 ShuffleBytes- 00:08:25.280 [2024-07-20 16:16:53.862638] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.280 [2024-07-20 16:16:53.862665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.280 #35 NEW cov: 11799 ft: 14102 corp: 13/458b lim: 90 exec/s: 0 rss: 67Mb L: 35/73 MS: 1 CrossOver- 00:08:25.280 [2024-07-20 16:16:53.902754] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.280 [2024-07-20 16:16:53.902781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.280 #36 NEW cov: 11799 ft: 14108 corp: 14/480b lim: 90 exec/s: 0 rss: 67Mb L: 22/73 MS: 1 ChangeByte- 00:08:25.280 [2024-07-20 16:16:53.942768] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.280 [2024-07-20 16:16:53.942796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.280 #37 NEW cov: 11799 ft: 14192 corp: 15/515b lim: 90 exec/s: 0 rss: 67Mb L: 35/73 MS: 1 PersAutoDict- DE: "\000\002\000\000"- 00:08:25.280 [2024-07-20 16:16:53.982951] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.280 [2024-07-20 16:16:53.982981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.280 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:25.280 #38 NEW cov: 11822 ft: 14244 corp: 16/550b lim: 90 exec/s: 0 rss: 67Mb L: 35/73 MS: 1 ChangeBit- 00:08:25.280 [2024-07-20 16:16:54.033108] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.281 [2024-07-20 16:16:54.033136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.281 #39 NEW cov: 11822 ft: 14266 corp: 17/585b lim: 90 exec/s: 0 rss: 67Mb L: 35/73 MS: 1 
PersAutoDict- DE: "\015\000\000\000"- 00:08:25.281 [2024-07-20 16:16:54.073299] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.281 [2024-07-20 16:16:54.073326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.539 #40 NEW cov: 11822 ft: 14284 corp: 18/608b lim: 90 exec/s: 40 rss: 68Mb L: 23/73 MS: 1 InsertByte- 00:08:25.539 [2024-07-20 16:16:54.113436] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.539 [2024-07-20 16:16:54.113467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.539 #41 NEW cov: 11822 ft: 14293 corp: 19/626b lim: 90 exec/s: 41 rss: 68Mb L: 18/73 MS: 1 EraseBytes- 00:08:25.539 [2024-07-20 16:16:54.153554] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.539 [2024-07-20 16:16:54.153581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.539 #42 NEW cov: 11822 ft: 14305 corp: 20/649b lim: 90 exec/s: 42 rss: 68Mb L: 23/73 MS: 1 ChangeByte- 00:08:25.539 [2024-07-20 16:16:54.193677] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.539 [2024-07-20 16:16:54.193703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.539 #43 NEW cov: 11822 ft: 14317 corp: 21/672b lim: 90 exec/s: 43 rss: 68Mb L: 23/73 MS: 1 InsertByte- 00:08:25.539 [2024-07-20 16:16:54.233975] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.539 [2024-07-20 16:16:54.234006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.539 [2024-07-20 16:16:54.234122] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:25.539 [2024-07-20 16:16:54.234154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.539 #44 NEW cov: 11822 ft: 14690 corp: 22/708b lim: 90 exec/s: 44 rss: 68Mb L: 36/73 MS: 1 InsertByte- 00:08:25.539 [2024-07-20 16:16:54.273885] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.539 [2024-07-20 16:16:54.273923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.539 #45 NEW cov: 11822 ft: 14703 corp: 23/727b lim: 90 exec/s: 45 rss: 68Mb L: 19/73 MS: 1 InsertByte- 00:08:25.539 [2024-07-20 16:16:54.314472] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.539 [2024-07-20 16:16:54.314505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.539 [2024-07-20 16:16:54.314598] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:25.539 [2024-07-20 16:16:54.314624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.539 [2024-07-20 16:16:54.314744] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:25.539 [2024-07-20 16:16:54.314770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.539 #46 NEW cov: 11822 ft: 14996 corp: 24/784b lim: 90 exec/s: 46 rss: 68Mb L: 57/73 MS: 1 CrossOver- 00:08:25.797 [2024-07-20 16:16:54.354371] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.797 [2024-07-20 16:16:54.354399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.797 [2024-07-20 16:16:54.354518] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:25.797 [2024-07-20 16:16:54.354543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.797 #47 NEW cov: 11822 ft: 15031 corp: 25/820b lim: 90 exec/s: 47 rss: 68Mb L: 36/73 MS: 1 ChangeBinInt- 00:08:25.797 [2024-07-20 16:16:54.394191] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.797 [2024-07-20 16:16:54.394218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.797 #48 NEW cov: 11822 ft: 15113 corp: 26/855b lim: 90 exec/s: 48 rss: 68Mb L: 35/73 MS: 1 ChangeBinInt- 00:08:25.797 [2024-07-20 16:16:54.434395] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.797 [2024-07-20 16:16:54.434422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.797 #49 NEW cov: 11822 ft: 15144 corp: 27/879b lim: 90 exec/s: 49 rss: 68Mb L: 24/73 MS: 1 InsertByte- 00:08:25.797 [2024-07-20 16:16:54.474519] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.797 [2024-07-20 16:16:54.474550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.797 #50 NEW cov: 11822 ft: 15154 corp: 28/910b lim: 90 exec/s: 50 rss: 68Mb L: 31/73 MS: 1 CrossOver- 00:08:25.797 [2024-07-20 16:16:54.514639] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.797 [2024-07-20 16:16:54.514666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.797 [2024-07-20 16:16:54.554810] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.797 [2024-07-20 16:16:54.554842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.797 #52 NEW cov: 11822 ft: 15221 corp: 29/945b lim: 90 exec/s: 52 rss: 68Mb L: 35/73 MS: 2 ChangeBinInt-ChangeByte- 00:08:25.797 [2024-07-20 16:16:54.594852] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.797 [2024-07-20 16:16:54.594879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.055 #53 NEW cov: 11822 ft: 15281 corp: 30/969b lim: 90 exec/s: 53 rss: 68Mb L: 24/73 MS: 1 InsertByte- 00:08:26.055 [2024-07-20 16:16:54.635006] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:26.055 [2024-07-20 16:16:54.635033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.055 #54 NEW cov: 11822 ft: 15312 corp: 31/992b lim: 90 exec/s: 54 rss: 68Mb L: 23/73 MS: 1 InsertByte- 00:08:26.055 [2024-07-20 16:16:54.675395] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:26.055 [2024-07-20 16:16:54.675424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.055 [2024-07-20 16:16:54.675493] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:26.055 [2024-07-20 16:16:54.675517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.055 #55 NEW cov: 11822 ft: 15342 corp: 32/1029b lim: 90 exec/s: 55 rss: 68Mb L: 37/73 MS: 1 InsertByte- 00:08:26.056 [2024-07-20 16:16:54.715266] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:26.056 [2024-07-20 16:16:54.715294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.056 #56 NEW cov: 11822 ft: 15360 corp: 33/1064b lim: 90 exec/s: 56 rss: 68Mb L: 35/73 MS: 1 CrossOver- 00:08:26.056 [2024-07-20 16:16:54.755246] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:26.056 [2024-07-20 16:16:54.755274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.056 #57 NEW cov: 11822 ft: 15383 corp: 34/1087b lim: 90 exec/s: 57 rss: 69Mb L: 23/73 MS: 1 ChangeByte- 00:08:26.056 [2024-07-20 16:16:54.795956] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:26.056 [2024-07-20 16:16:54.795990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.056 [2024-07-20 16:16:54.796103] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:26.056 [2024-07-20 16:16:54.796132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.056 [2024-07-20 16:16:54.796259] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:26.056 [2024-07-20 16:16:54.796284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.056 #58 NEW cov: 11822 ft: 15405 corp: 35/1151b lim: 90 exec/s: 58 rss: 69Mb L: 64/73 MS: 1 InsertRepeatedBytes- 00:08:26.056 [2024-07-20 16:16:54.845668] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:26.056 [2024-07-20 16:16:54.845721] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.313 #59 NEW cov: 11822 ft: 15417 corp: 36/1174b lim: 90 exec/s: 59 rss: 69Mb L: 23/73 MS: 1 ChangeByte- 00:08:26.313 [2024-07-20 16:16:54.885312] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:26.313 [2024-07-20 16:16:54.885347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.313 #60 NEW cov: 11822 ft: 15477 corp: 37/1209b lim: 90 exec/s: 60 rss: 69Mb L: 35/73 MS: 1 CopyPart- 00:08:26.313 [2024-07-20 16:16:54.926031] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:26.313 [2024-07-20 16:16:54.926063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.313 [2024-07-20 16:16:54.926183] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:26.313 [2024-07-20 16:16:54.926207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.313 #61 NEW cov: 11822 ft: 15484 corp: 38/1245b lim: 90 exec/s: 61 rss: 69Mb L: 36/73 MS: 1 InsertByte- 00:08:26.313 [2024-07-20 16:16:54.966625] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:26.313 [2024-07-20 16:16:54.966660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.313 [2024-07-20 16:16:54.966756] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:26.313 [2024-07-20 16:16:54.966780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.313 [2024-07-20 16:16:54.966896] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:26.313 [2024-07-20 16:16:54.966920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.313 [2024-07-20 16:16:54.967046] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:26.313 [2024-07-20 16:16:54.967073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.313 #62 NEW cov: 11822 ft: 15494 corp: 39/1326b lim: 90 exec/s: 62 rss: 69Mb L: 81/81 MS: 1 CMP- DE: "\001.\362vc8@ "- 00:08:26.313 [2024-07-20 16:16:55.016079] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:26.313 [2024-07-20 16:16:55.016106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.313 #63 NEW cov: 11822 ft: 15510 corp: 40/1349b lim: 90 exec/s: 63 rss: 69Mb L: 23/81 MS: 1 CMP- DE: "\001\000\000\000"- 00:08:26.313 [2024-07-20 16:16:55.056688] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:26.313 [2024-07-20 16:16:55.056726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.313 [2024-07-20 16:16:55.056839] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:26.313 [2024-07-20 16:16:55.056868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.313 [2024-07-20 16:16:55.056988] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:26.313 [2024-07-20 16:16:55.057018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.313 #64 pulse cov: 11822 ft: 15517 corp: 40/1349b lim: 90 exec/s: 32 rss: 69Mb 00:08:26.313 #64 NEW cov: 11822 ft: 15517 corp: 41/1408b lim: 90 exec/s: 32 rss: 69Mb L: 59/81 MS: 1 EraseBytes- 00:08:26.313 #64 DONE cov: 11822 ft: 15517 corp: 41/1408b lim: 90 exec/s: 32 rss: 69Mb 00:08:26.313 ###### Recommended dictionary. ###### 00:08:26.313 "\001\000\000\000\000\000\000\000" # Uses: 0 00:08:26.313 "\015\000\000\000" # Uses: 3 00:08:26.313 "\000\002\000\000" # Uses: 1 00:08:26.313 "\001.\362vc8@ " # Uses: 0 00:08:26.314 "\001\000\000\000" # Uses: 0 00:08:26.314 ###### End of recommended dictionary. ###### 00:08:26.314 Done 64 runs in 2 second(s) 00:08:26.570 16:16:55 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_20.conf 00:08:26.570 16:16:55 -- ../common.sh@72 -- # (( i++ )) 00:08:26.570 16:16:55 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:26.570 16:16:55 -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:08:26.570 16:16:55 -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:08:26.570 16:16:55 -- nvmf/run.sh@24 -- # local timen=1 00:08:26.570 16:16:55 -- nvmf/run.sh@25 -- # local core=0x1 00:08:26.570 16:16:55 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:26.570 16:16:55 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:08:26.570 16:16:55 -- nvmf/run.sh@29 -- # printf %02d 21 00:08:26.571 16:16:55 -- nvmf/run.sh@29 -- # port=4421 00:08:26.571 16:16:55 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:26.571 16:16:55 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:08:26.571 16:16:55 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:26.571 16:16:55 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 -r /var/tmp/spdk21.sock 00:08:26.571 [2024-07-20 16:16:55.225625] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
00:08:26.571 [2024-07-20 16:16:55.225719] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2278810 ] 00:08:26.571 EAL: No free 2048 kB hugepages reported on node 1 00:08:26.828 [2024-07-20 16:16:55.401616] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:26.828 [2024-07-20 16:16:55.420999] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:26.828 [2024-07-20 16:16:55.421118] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:26.828 [2024-07-20 16:16:55.472490] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:26.828 [2024-07-20 16:16:55.488780] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:08:26.828 INFO: Running with entropic power schedule (0xFF, 100). 00:08:26.828 INFO: Seed: 862193447 00:08:26.828 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:08:26.828 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:08:26.828 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:26.828 INFO: A corpus is not provided, starting from an empty corpus 00:08:26.828 #2 INITED exec/s: 0 rss: 60Mb 00:08:26.828 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:26.828 This may also happen if the target rejected all inputs we tried so far 00:08:26.828 [2024-07-20 16:16:55.534207] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.828 [2024-07-20 16:16:55.534239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.828 [2024-07-20 16:16:55.534271] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.828 [2024-07-20 16:16:55.534291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.828 [2024-07-20 16:16:55.534348] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.828 [2024-07-20 16:16:55.534365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.095 NEW_FUNC[1/671]: 0x4b6b60 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:08:27.096 NEW_FUNC[2/671]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:27.096 #23 NEW cov: 11549 ft: 11564 corp: 2/35b lim: 50 exec/s: 0 rss: 66Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:08:27.096 [2024-07-20 16:16:55.854966] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.096 [2024-07-20 16:16:55.855000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.096 [2024-07-20 16:16:55.855045] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.096 [2024-07-20 
16:16:55.855061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.096 [2024-07-20 16:16:55.855120] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.096 [2024-07-20 16:16:55.855151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.096 NEW_FUNC[1/1]: 0x4d9120 in malloc_completion_poller /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/module/bdev/malloc/bdev_malloc.c:849 00:08:27.096 #26 NEW cov: 11683 ft: 12159 corp: 3/74b lim: 50 exec/s: 0 rss: 66Mb L: 39/39 MS: 3 InsertByte-CrossOver-InsertRepeatedBytes- 00:08:27.096 [2024-07-20 16:16:55.894986] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.096 [2024-07-20 16:16:55.895017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.096 [2024-07-20 16:16:55.895050] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.096 [2024-07-20 16:16:55.895066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.096 [2024-07-20 16:16:55.895125] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.096 [2024-07-20 16:16:55.895141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.359 #27 NEW cov: 11689 ft: 12415 corp: 4/113b lim: 50 exec/s: 0 rss: 66Mb L: 39/39 MS: 1 ChangeBinInt- 00:08:27.359 [2024-07-20 16:16:55.935104] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.359 [2024-07-20 16:16:55.935133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.359 [2024-07-20 16:16:55.935178] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.359 [2024-07-20 16:16:55.935193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.359 [2024-07-20 16:16:55.935251] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.359 [2024-07-20 16:16:55.935268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.359 #28 NEW cov: 11774 ft: 12710 corp: 5/152b lim: 50 exec/s: 0 rss: 66Mb L: 39/39 MS: 1 ChangeBit- 00:08:27.359 [2024-07-20 16:16:55.975382] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.359 [2024-07-20 16:16:55.975415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.359 [2024-07-20 16:16:55.975462] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.359 [2024-07-20 16:16:55.975478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
00:08:27.359 [2024-07-20 16:16:55.975534] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.359 [2024-07-20 16:16:55.975550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.359 [2024-07-20 16:16:55.975607] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:27.359 [2024-07-20 16:16:55.975622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.359 #29 NEW cov: 11774 ft: 13093 corp: 6/192b lim: 50 exec/s: 0 rss: 66Mb L: 40/40 MS: 1 InsertByte- 00:08:27.359 [2024-07-20 16:16:56.015462] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.359 [2024-07-20 16:16:56.015492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.359 [2024-07-20 16:16:56.015539] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.359 [2024-07-20 16:16:56.015557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.359 [2024-07-20 16:16:56.015615] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.359 [2024-07-20 16:16:56.015632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.359 [2024-07-20 16:16:56.015689] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:27.359 [2024-07-20 16:16:56.015705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.359 #30 NEW cov: 11774 ft: 13138 corp: 7/232b lim: 50 exec/s: 0 rss: 67Mb L: 40/40 MS: 1 InsertByte- 00:08:27.359 [2024-07-20 16:16:56.055600] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.359 [2024-07-20 16:16:56.055629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.359 [2024-07-20 16:16:56.055673] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.359 [2024-07-20 16:16:56.055689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.359 [2024-07-20 16:16:56.055745] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.359 [2024-07-20 16:16:56.055761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.359 [2024-07-20 16:16:56.055818] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:27.359 [2024-07-20 16:16:56.055834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.359 #31 NEW cov: 11774 ft: 13257 corp: 8/272b lim: 50 exec/s: 0 rss: 67Mb L: 40/40 MS: 1 ChangeByte- 00:08:27.359 
[2024-07-20 16:16:56.095576] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.359 [2024-07-20 16:16:56.095605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.359 [2024-07-20 16:16:56.095640] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.359 [2024-07-20 16:16:56.095656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.359 [2024-07-20 16:16:56.095714] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.359 [2024-07-20 16:16:56.095729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.359 #32 NEW cov: 11774 ft: 13283 corp: 9/304b lim: 50 exec/s: 0 rss: 67Mb L: 32/40 MS: 1 EraseBytes- 00:08:27.359 [2024-07-20 16:16:56.135838] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.359 [2024-07-20 16:16:56.135867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.359 [2024-07-20 16:16:56.135910] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.359 [2024-07-20 16:16:56.135926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.359 [2024-07-20 16:16:56.135980] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.359 [2024-07-20 16:16:56.136013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.359 [2024-07-20 16:16:56.136070] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:27.359 [2024-07-20 16:16:56.136085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.619 #33 NEW cov: 11774 ft: 13362 corp: 10/346b lim: 50 exec/s: 0 rss: 67Mb L: 42/42 MS: 1 CMP- DE: "\001\005"- 00:08:27.619 [2024-07-20 16:16:56.175803] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.619 [2024-07-20 16:16:56.175831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.619 [2024-07-20 16:16:56.175864] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.619 [2024-07-20 16:16:56.175878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.619 [2024-07-20 16:16:56.175932] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.619 [2024-07-20 16:16:56.175948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.619 #34 NEW cov: 11774 ft: 13456 corp: 11/380b lim: 50 exec/s: 0 rss: 67Mb L: 34/42 MS: 1 EraseBytes- 00:08:27.619 
[2024-07-20 16:16:56.215914] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.619 [2024-07-20 16:16:56.215942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.619 [2024-07-20 16:16:56.215987] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.619 [2024-07-20 16:16:56.216002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.619 [2024-07-20 16:16:56.216061] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.619 [2024-07-20 16:16:56.216077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.619 #35 NEW cov: 11774 ft: 13514 corp: 12/414b lim: 50 exec/s: 0 rss: 67Mb L: 34/42 MS: 1 CMP- DE: "\007\000"- 00:08:27.619 [2024-07-20 16:16:56.256221] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.619 [2024-07-20 16:16:56.256253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.619 [2024-07-20 16:16:56.256286] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.619 [2024-07-20 16:16:56.256301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.619 [2024-07-20 16:16:56.256360] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.619 [2024-07-20 16:16:56.256377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.619 [2024-07-20 16:16:56.256434] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:27.619 [2024-07-20 16:16:56.256453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.619 #36 NEW cov: 11774 ft: 13536 corp: 13/461b lim: 50 exec/s: 0 rss: 67Mb L: 47/47 MS: 1 InsertRepeatedBytes- 00:08:27.619 [2024-07-20 16:16:56.296193] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.619 [2024-07-20 16:16:56.296221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.619 [2024-07-20 16:16:56.296278] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.619 [2024-07-20 16:16:56.296293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.619 [2024-07-20 16:16:56.296351] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.619 [2024-07-20 16:16:56.296366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.619 #37 NEW cov: 11774 ft: 13576 corp: 14/497b lim: 50 exec/s: 0 rss: 67Mb L: 36/47 MS: 1 PersAutoDict- DE: 
"\001\005"- 00:08:27.619 [2024-07-20 16:16:56.336439] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.619 [2024-07-20 16:16:56.336472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.619 [2024-07-20 16:16:56.336539] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.619 [2024-07-20 16:16:56.336556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.619 [2024-07-20 16:16:56.336611] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.619 [2024-07-20 16:16:56.336626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.619 [2024-07-20 16:16:56.336682] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:27.619 [2024-07-20 16:16:56.336697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.619 #38 NEW cov: 11774 ft: 13612 corp: 15/539b lim: 50 exec/s: 0 rss: 67Mb L: 42/47 MS: 1 CrossOver- 00:08:27.619 [2024-07-20 16:16:56.376535] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.619 [2024-07-20 16:16:56.376562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.619 [2024-07-20 16:16:56.376598] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.619 [2024-07-20 16:16:56.376614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.619 [2024-07-20 16:16:56.376671] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.619 [2024-07-20 16:16:56.376688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.619 [2024-07-20 16:16:56.376746] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:27.619 [2024-07-20 16:16:56.376760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.619 #39 NEW cov: 11774 ft: 13672 corp: 16/586b lim: 50 exec/s: 0 rss: 68Mb L: 47/47 MS: 1 ShuffleBytes- 00:08:27.619 [2024-07-20 16:16:56.416662] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.619 [2024-07-20 16:16:56.416691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.619 [2024-07-20 16:16:56.416731] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.619 [2024-07-20 16:16:56.416747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.619 [2024-07-20 16:16:56.416803] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.619 [2024-07-20 16:16:56.416819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.619 [2024-07-20 16:16:56.416876] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:27.619 [2024-07-20 16:16:56.416892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.879 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:27.879 #40 NEW cov: 11797 ft: 13718 corp: 17/626b lim: 50 exec/s: 0 rss: 68Mb L: 40/47 MS: 1 ShuffleBytes- 00:08:27.879 [2024-07-20 16:16:56.456636] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.879 [2024-07-20 16:16:56.456664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.879 [2024-07-20 16:16:56.456707] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.879 [2024-07-20 16:16:56.456723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.879 [2024-07-20 16:16:56.456781] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.879 [2024-07-20 16:16:56.456797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.879 #41 NEW cov: 11797 ft: 13740 corp: 18/662b lim: 50 exec/s: 0 rss: 68Mb L: 36/47 MS: 1 PersAutoDict- DE: "\007\000"- 00:08:27.879 [2024-07-20 16:16:56.496760] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.879 [2024-07-20 16:16:56.496788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.879 [2024-07-20 16:16:56.496821] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.879 [2024-07-20 16:16:56.496836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.879 [2024-07-20 16:16:56.496895] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.879 [2024-07-20 16:16:56.496909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.879 #42 NEW cov: 11797 ft: 13772 corp: 19/696b lim: 50 exec/s: 42 rss: 68Mb L: 34/47 MS: 1 CMP- DE: "\014\000\000\000"- 00:08:27.879 [2024-07-20 16:16:56.536880] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.879 [2024-07-20 16:16:56.536912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.879 [2024-07-20 16:16:56.536944] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.879 [2024-07-20 16:16:56.536960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.879 [2024-07-20 16:16:56.537018] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.879 [2024-07-20 16:16:56.537034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.879 #43 NEW cov: 11797 ft: 13802 corp: 20/730b lim: 50 exec/s: 43 rss: 68Mb L: 34/47 MS: 1 ChangeBinInt- 00:08:27.879 [2024-07-20 16:16:56.577124] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.879 [2024-07-20 16:16:56.577151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.879 [2024-07-20 16:16:56.577193] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.879 [2024-07-20 16:16:56.577210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.879 [2024-07-20 16:16:56.577267] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.879 [2024-07-20 16:16:56.577284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.879 [2024-07-20 16:16:56.577340] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:27.879 [2024-07-20 16:16:56.577355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.879 #44 NEW cov: 11797 ft: 13841 corp: 21/779b lim: 50 exec/s: 44 rss: 68Mb L: 49/49 MS: 1 CrossOver- 00:08:27.879 [2024-07-20 16:16:56.617468] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.879 [2024-07-20 16:16:56.617496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.880 [2024-07-20 16:16:56.617566] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.880 [2024-07-20 16:16:56.617580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.880 [2024-07-20 16:16:56.617639] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.880 [2024-07-20 16:16:56.617655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.880 [2024-07-20 16:16:56.617712] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:27.880 [2024-07-20 16:16:56.617727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.880 [2024-07-20 16:16:56.617786] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:27.880 [2024-07-20 16:16:56.617803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:27.880 #45 NEW cov: 11797 ft: 13900 corp: 22/829b lim: 
50 exec/s: 45 rss: 68Mb L: 50/50 MS: 1 InsertRepeatedBytes- 00:08:27.880 [2024-07-20 16:16:56.657242] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.880 [2024-07-20 16:16:56.657268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.880 [2024-07-20 16:16:56.657300] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.880 [2024-07-20 16:16:56.657318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.880 [2024-07-20 16:16:56.657374] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.880 [2024-07-20 16:16:56.657390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.880 #46 NEW cov: 11797 ft: 13914 corp: 23/863b lim: 50 exec/s: 46 rss: 68Mb L: 34/50 MS: 1 CopyPart- 00:08:28.139 [2024-07-20 16:16:56.687169] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:28.139 [2024-07-20 16:16:56.687198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.139 [2024-07-20 16:16:56.687242] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:28.139 [2024-07-20 16:16:56.687257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.139 #47 NEW cov: 11797 ft: 14264 corp: 24/891b lim: 50 exec/s: 47 rss: 68Mb L: 28/50 MS: 1 EraseBytes- 00:08:28.139 [2024-07-20 16:16:56.727473] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:28.139 [2024-07-20 16:16:56.727501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.139 [2024-07-20 16:16:56.727538] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:28.139 [2024-07-20 16:16:56.727554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.139 [2024-07-20 16:16:56.727614] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:28.139 [2024-07-20 16:16:56.727630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.139 #48 NEW cov: 11797 ft: 14277 corp: 25/923b lim: 50 exec/s: 48 rss: 68Mb L: 32/50 MS: 1 ChangeBit- 00:08:28.139 [2024-07-20 16:16:56.767727] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:28.139 [2024-07-20 16:16:56.767755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.139 [2024-07-20 16:16:56.767797] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:28.139 [2024-07-20 16:16:56.767814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.139 [2024-07-20 16:16:56.767869] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:28.139 [2024-07-20 16:16:56.767886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.139 [2024-07-20 16:16:56.767943] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:28.139 [2024-07-20 16:16:56.767959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.139 #49 NEW cov: 11797 ft: 14282 corp: 26/963b lim: 50 exec/s: 49 rss: 68Mb L: 40/50 MS: 1 ChangeBit- 00:08:28.139 [2024-07-20 16:16:56.807884] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:28.139 [2024-07-20 16:16:56.807912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.139 [2024-07-20 16:16:56.807960] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:28.139 [2024-07-20 16:16:56.807977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.139 [2024-07-20 16:16:56.808034] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:28.139 [2024-07-20 16:16:56.808051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.139 [2024-07-20 16:16:56.808108] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:28.139 [2024-07-20 16:16:56.808125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.139 #50 NEW cov: 11797 ft: 14285 corp: 27/1010b lim: 50 exec/s: 50 rss: 68Mb L: 47/50 MS: 1 ChangeByte- 00:08:28.139 [2024-07-20 16:16:56.847976] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:28.139 [2024-07-20 16:16:56.848003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.139 [2024-07-20 16:16:56.848053] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:28.139 [2024-07-20 16:16:56.848070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.139 [2024-07-20 16:16:56.848124] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:28.139 [2024-07-20 16:16:56.848138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.139 [2024-07-20 16:16:56.848193] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:28.139 [2024-07-20 16:16:56.848208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.139 #51 NEW cov: 11797 ft: 14344 corp: 28/1050b lim: 50 
exec/s: 51 rss: 68Mb L: 40/50 MS: 1 ChangeBinInt- 00:08:28.139 [2024-07-20 16:16:56.887640] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:28.139 [2024-07-20 16:16:56.887667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.139 #52 NEW cov: 11797 ft: 15111 corp: 29/1067b lim: 50 exec/s: 52 rss: 68Mb L: 17/50 MS: 1 EraseBytes- 00:08:28.139 [2024-07-20 16:16:56.928239] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:28.139 [2024-07-20 16:16:56.928267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.139 [2024-07-20 16:16:56.928303] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:28.139 [2024-07-20 16:16:56.928320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.139 [2024-07-20 16:16:56.928374] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:28.139 [2024-07-20 16:16:56.928391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.139 [2024-07-20 16:16:56.928449] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:28.139 [2024-07-20 16:16:56.928464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.399 #53 NEW cov: 11797 ft: 15132 corp: 30/1109b lim: 50 exec/s: 53 rss: 69Mb L: 42/50 MS: 1 CMP- DE: "\001\000"- 00:08:28.399 [2024-07-20 16:16:56.968187] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:28.399 [2024-07-20 16:16:56.968215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.399 [2024-07-20 16:16:56.968248] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:28.399 [2024-07-20 16:16:56.968265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.399 [2024-07-20 16:16:56.968319] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:28.399 [2024-07-20 16:16:56.968351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.399 #54 NEW cov: 11797 ft: 15164 corp: 31/1141b lim: 50 exec/s: 54 rss: 69Mb L: 32/50 MS: 1 ShuffleBytes- 00:08:28.399 [2024-07-20 16:16:57.008421] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:28.399 [2024-07-20 16:16:57.008452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.399 [2024-07-20 16:16:57.008504] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:28.399 [2024-07-20 16:16:57.008519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.399 [2024-07-20 16:16:57.008574] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:28.399 [2024-07-20 16:16:57.008590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.399 [2024-07-20 16:16:57.008645] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:28.399 [2024-07-20 16:16:57.008660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.399 #60 NEW cov: 11797 ft: 15169 corp: 32/1183b lim: 50 exec/s: 60 rss: 69Mb L: 42/50 MS: 1 ChangeBinInt- 00:08:28.399 [2024-07-20 16:16:57.048397] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:28.399 [2024-07-20 16:16:57.048424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.399 [2024-07-20 16:16:57.048465] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:28.399 [2024-07-20 16:16:57.048481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.399 [2024-07-20 16:16:57.048537] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:28.399 [2024-07-20 16:16:57.048554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.399 #61 NEW cov: 11797 ft: 15173 corp: 33/1222b lim: 50 exec/s: 61 rss: 69Mb L: 39/50 MS: 1 ChangeBinInt- 00:08:28.399 [2024-07-20 16:16:57.078654] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:28.399 [2024-07-20 16:16:57.078682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.399 [2024-07-20 16:16:57.078723] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:28.399 [2024-07-20 16:16:57.078736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.399 [2024-07-20 16:16:57.078790] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:28.399 [2024-07-20 16:16:57.078806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.399 [2024-07-20 16:16:57.078861] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:28.399 [2024-07-20 16:16:57.078876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.399 #62 NEW cov: 11797 ft: 15220 corp: 34/1269b lim: 50 exec/s: 62 rss: 69Mb L: 47/50 MS: 1 ShuffleBytes- 00:08:28.399 [2024-07-20 16:16:57.118620] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:28.399 [2024-07-20 16:16:57.118648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.399 [2024-07-20 16:16:57.118684] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:28.399 [2024-07-20 16:16:57.118700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.399 [2024-07-20 16:16:57.118757] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:28.399 [2024-07-20 16:16:57.118771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.399 #63 NEW cov: 11797 ft: 15277 corp: 35/1305b lim: 50 exec/s: 63 rss: 69Mb L: 36/50 MS: 1 PersAutoDict- DE: "\014\000\000\000"- 00:08:28.399 [2024-07-20 16:16:57.158760] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:28.399 [2024-07-20 16:16:57.158787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.399 [2024-07-20 16:16:57.158823] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:28.399 [2024-07-20 16:16:57.158838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.399 [2024-07-20 16:16:57.158895] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:28.399 [2024-07-20 16:16:57.158911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.399 #64 NEW cov: 11797 ft: 15280 corp: 36/1341b lim: 50 exec/s: 64 rss: 69Mb L: 36/50 MS: 1 CrossOver- 00:08:28.399 [2024-07-20 16:16:57.199175] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:28.399 [2024-07-20 16:16:57.199203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.399 [2024-07-20 16:16:57.199255] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:28.399 [2024-07-20 16:16:57.199271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.399 [2024-07-20 16:16:57.199327] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:28.399 [2024-07-20 16:16:57.199344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.399 [2024-07-20 16:16:57.199373] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:28.399 [2024-07-20 16:16:57.199387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.399 [2024-07-20 16:16:57.199444] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:28.399 [2024-07-20 16:16:57.199460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:28.678 #65 NEW cov: 11797 ft: 
15299 corp: 37/1391b lim: 50 exec/s: 65 rss: 69Mb L: 50/50 MS: 1 ChangeByte- 00:08:28.678 [2024-07-20 16:16:57.239124] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:28.678 [2024-07-20 16:16:57.239151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.678 [2024-07-20 16:16:57.239202] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:28.678 [2024-07-20 16:16:57.239222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.678 [2024-07-20 16:16:57.239279] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:28.678 [2024-07-20 16:16:57.239297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.678 [2024-07-20 16:16:57.239352] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:28.678 [2024-07-20 16:16:57.239368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.678 #66 NEW cov: 11797 ft: 15334 corp: 38/1433b lim: 50 exec/s: 66 rss: 69Mb L: 42/50 MS: 1 ShuffleBytes- 00:08:28.678 [2024-07-20 16:16:57.279275] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:28.678 [2024-07-20 16:16:57.279303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.678 [2024-07-20 16:16:57.279338] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:28.678 [2024-07-20 16:16:57.279353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.678 [2024-07-20 16:16:57.279408] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:28.678 [2024-07-20 16:16:57.279424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.678 [2024-07-20 16:16:57.279483] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:28.678 [2024-07-20 16:16:57.279499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.678 #67 NEW cov: 11797 ft: 15336 corp: 39/1482b lim: 50 exec/s: 67 rss: 69Mb L: 49/50 MS: 1 InsertRepeatedBytes- 00:08:28.678 [2024-07-20 16:16:57.319357] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:28.678 [2024-07-20 16:16:57.319385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.678 [2024-07-20 16:16:57.319429] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:28.678 [2024-07-20 16:16:57.319448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.678 [2024-07-20 
16:16:57.319502] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:28.678 [2024-07-20 16:16:57.319518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.678 [2024-07-20 16:16:57.319574] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:28.678 [2024-07-20 16:16:57.319589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.678 #68 NEW cov: 11797 ft: 15347 corp: 40/1525b lim: 50 exec/s: 68 rss: 69Mb L: 43/50 MS: 1 InsertByte- 00:08:28.678 [2024-07-20 16:16:57.359366] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:28.678 [2024-07-20 16:16:57.359395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.678 [2024-07-20 16:16:57.359427] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:28.678 [2024-07-20 16:16:57.359447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.678 [2024-07-20 16:16:57.359508] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:28.678 [2024-07-20 16:16:57.359524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.678 #69 NEW cov: 11797 ft: 15355 corp: 41/1559b lim: 50 exec/s: 69 rss: 69Mb L: 34/50 MS: 1 ChangeBit- 00:08:28.678 [2024-07-20 16:16:57.399267] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:28.679 [2024-07-20 16:16:57.399295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.679 [2024-07-20 16:16:57.399343] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:28.679 [2024-07-20 16:16:57.399360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.679 #70 NEW cov: 11797 ft: 15356 corp: 42/1587b lim: 50 exec/s: 70 rss: 69Mb L: 28/50 MS: 1 CrossOver- 00:08:28.679 [2024-07-20 16:16:57.439588] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:28.679 [2024-07-20 16:16:57.439617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.679 [2024-07-20 16:16:57.439649] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:28.679 [2024-07-20 16:16:57.439664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.679 [2024-07-20 16:16:57.439720] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:28.679 [2024-07-20 16:16:57.439736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.679 #71 NEW cov: 11797 
ft: 15417 corp: 43/1621b lim: 50 exec/s: 71 rss: 69Mb L: 34/50 MS: 1 ChangeBinInt- 00:08:28.679 [2024-07-20 16:16:57.479871] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:28.679 [2024-07-20 16:16:57.479901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.679 [2024-07-20 16:16:57.479941] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:28.679 [2024-07-20 16:16:57.479957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.679 [2024-07-20 16:16:57.480011] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:28.679 [2024-07-20 16:16:57.480026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.679 [2024-07-20 16:16:57.480084] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:28.679 [2024-07-20 16:16:57.480100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.938 #72 NEW cov: 11797 ft: 15452 corp: 44/1668b lim: 50 exec/s: 72 rss: 69Mb L: 47/50 MS: 1 ChangeBinInt- 00:08:28.938 [2024-07-20 16:16:57.519971] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:28.938 [2024-07-20 16:16:57.520001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.938 [2024-07-20 16:16:57.520054] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:28.938 [2024-07-20 16:16:57.520069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.938 [2024-07-20 16:16:57.520126] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:28.938 [2024-07-20 16:16:57.520145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.938 [2024-07-20 16:16:57.520199] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:28.938 [2024-07-20 16:16:57.520214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.938 #73 NEW cov: 11797 ft: 15463 corp: 45/1711b lim: 50 exec/s: 36 rss: 70Mb L: 43/50 MS: 1 InsertByte- 00:08:28.938 #73 DONE cov: 11797 ft: 15463 corp: 45/1711b lim: 50 exec/s: 36 rss: 70Mb 00:08:28.938 ###### Recommended dictionary. ###### 00:08:28.938 "\001\005" # Uses: 2 00:08:28.938 "\007\000" # Uses: 1 00:08:28.938 "\014\000\000\000" # Uses: 2 00:08:28.938 "\001\000" # Uses: 0 00:08:28.938 ###### End of recommended dictionary. 
###### 00:08:28.938 Done 73 runs in 2 second(s) 00:08:28.938 16:16:57 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_21.conf 00:08:28.938 16:16:57 -- ../common.sh@72 -- # (( i++ )) 00:08:28.938 16:16:57 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:28.938 16:16:57 -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1 00:08:28.938 16:16:57 -- nvmf/run.sh@23 -- # local fuzzer_type=22 00:08:28.938 16:16:57 -- nvmf/run.sh@24 -- # local timen=1 00:08:28.938 16:16:57 -- nvmf/run.sh@25 -- # local core=0x1 00:08:28.938 16:16:57 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:28.938 16:16:57 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf 00:08:28.938 16:16:57 -- nvmf/run.sh@29 -- # printf %02d 22 00:08:28.938 16:16:57 -- nvmf/run.sh@29 -- # port=4422 00:08:28.938 16:16:57 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:28.938 16:16:57 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' 00:08:28.938 16:16:57 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:28.938 16:16:57 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 -r /var/tmp/spdk22.sock 00:08:28.938 [2024-07-20 16:16:57.688109] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:08:28.938 [2024-07-20 16:16:57.688202] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2279186 ] 00:08:28.938 EAL: No free 2048 kB hugepages reported on node 1 00:08:29.196 [2024-07-20 16:16:57.873703] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:29.196 [2024-07-20 16:16:57.893335] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:29.196 [2024-07-20 16:16:57.893482] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:29.196 [2024-07-20 16:16:57.944921] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:29.196 [2024-07-20 16:16:57.961251] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 *** 00:08:29.196 INFO: Running with entropic power schedule (0xFF, 100). 00:08:29.196 INFO: Seed: 3333209100 00:08:29.454 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:08:29.454 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:08:29.454 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:29.454 INFO: A corpus is not provided, starting from an empty corpus 00:08:29.454 #2 INITED exec/s: 0 rss: 59Mb 00:08:29.454 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:29.454 This may also happen if the target rejected all inputs we tried so far 00:08:29.454 [2024-07-20 16:16:58.031119] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.454 [2024-07-20 16:16:58.031155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.454 [2024-07-20 16:16:58.031263] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.454 [2024-07-20 16:16:58.031290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.454 [2024-07-20 16:16:58.031410] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.454 [2024-07-20 16:16:58.031430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.454 [2024-07-20 16:16:58.031551] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.454 [2024-07-20 16:16:58.031573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.714 NEW_FUNC[1/672]: 0x4b8e20 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:08:29.714 NEW_FUNC[2/672]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:29.714 #12 NEW cov: 11593 ft: 11589 corp: 2/72b lim: 85 exec/s: 0 rss: 66Mb L: 71/71 MS: 5 InsertRepeatedBytes-ChangeByte-ShuffleBytes-EraseBytes-InsertRepeatedBytes- 00:08:29.714 [2024-07-20 16:16:58.351838] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.714 [2024-07-20 16:16:58.351875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.714 [2024-07-20 16:16:58.352000] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.714 [2024-07-20 16:16:58.352021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.714 [2024-07-20 16:16:58.352127] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.714 [2024-07-20 16:16:58.352149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.714 [2024-07-20 16:16:58.352270] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.714 [2024-07-20 16:16:58.352295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.714 #13 NEW cov: 11709 ft: 12047 corp: 3/146b lim: 85 exec/s: 0 rss: 66Mb L: 74/74 MS: 1 CopyPart- 00:08:29.714 [2024-07-20 16:16:58.401978] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.714 [2024-07-20 16:16:58.402014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.714 [2024-07-20 16:16:58.402113] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.714 [2024-07-20 16:16:58.402134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.714 [2024-07-20 16:16:58.402263] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.714 [2024-07-20 16:16:58.402284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.714 [2024-07-20 16:16:58.402394] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.714 [2024-07-20 16:16:58.402413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.714 #14 NEW cov: 11715 ft: 12478 corp: 4/220b lim: 85 exec/s: 0 rss: 66Mb L: 74/74 MS: 1 ChangeBinInt- 00:08:29.714 [2024-07-20 16:16:58.441687] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.714 [2024-07-20 16:16:58.441719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.714 [2024-07-20 16:16:58.441827] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.714 [2024-07-20 16:16:58.441846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.714 [2024-07-20 16:16:58.441959] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.714 [2024-07-20 16:16:58.441978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.714 #15 NEW cov: 11800 ft: 13113 corp: 5/280b lim: 85 exec/s: 0 rss: 66Mb L: 60/74 MS: 1 InsertRepeatedBytes- 00:08:29.714 [2024-07-20 16:16:58.482116] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.714 [2024-07-20 16:16:58.482148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.714 [2024-07-20 16:16:58.482261] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.714 [2024-07-20 16:16:58.482284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.714 [2024-07-20 16:16:58.482394] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.714 [2024-07-20 16:16:58.482412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.714 [2024-07-20 16:16:58.482534] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.714 [2024-07-20 16:16:58.482557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.714 #16 NEW cov: 11800 ft: 13194 corp: 
6/352b lim: 85 exec/s: 0 rss: 66Mb L: 72/74 MS: 1 InsertByte-
00:08:29.974 [2024-07-20 16:16:58.521776] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:29.974 [2024-07-20 16:16:58.521810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:29.974 [2024-07-20 16:16:58.521920] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:29.974 [2024-07-20 16:16:58.521944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:29.974 #17 NEW cov: 11800 ft: 13571 corp: 7/400b lim: 85 exec/s: 0 rss: 66Mb L: 48/74 MS: 1 CrossOver-
00:08:29.974 [2024-07-20 16:16:58.562151] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:29.974 [2024-07-20 16:16:58.562183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:29.974 [2024-07-20 16:16:58.562305] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:29.974 [2024-07-20 16:16:58.562323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:29.974 [2024-07-20 16:16:58.562454] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0
00:08:29.974 [2024-07-20 16:16:58.562476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:29.974 #18 NEW cov: 11800 ft: 13653 corp: 8/460b lim: 85 exec/s: 0 rss: 66Mb L: 60/74 MS: 1 ChangeByte-
00:08:29.974 [2024-07-20 16:16:58.612670] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:29.974 [2024-07-20 16:16:58.612706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:29.974 [2024-07-20 16:16:58.612834] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:29.974 [2024-07-20 16:16:58.612863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:29.974 [2024-07-20 16:16:58.612979] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0
00:08:29.974 [2024-07-20 16:16:58.612998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:29.974 [2024-07-20 16:16:58.613120] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0
00:08:29.974 [2024-07-20 16:16:58.613143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:29.974 #19 NEW cov: 11800 ft: 13682 corp: 9/531b lim: 85 exec/s: 0 rss: 66Mb L: 71/74 MS: 1 ChangeBit-
00:08:29.974 [2024-07-20 16:16:58.652657] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:29.974 [2024-07-20 16:16:58.652692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:29.974 [2024-07-20 16:16:58.652760] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:29.974 [2024-07-20 16:16:58.652783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:29.974 [2024-07-20 16:16:58.652899] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0
00:08:29.974 [2024-07-20 16:16:58.652927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:29.974 [2024-07-20 16:16:58.653046] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0
00:08:29.974 [2024-07-20 16:16:58.653071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:29.974 #20 NEW cov: 11800 ft: 13714 corp: 10/605b lim: 85 exec/s: 0 rss: 67Mb L: 74/74 MS: 1 ShuffleBytes-
00:08:29.974 [2024-07-20 16:16:58.692584] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:29.974 [2024-07-20 16:16:58.692616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:29.974 [2024-07-20 16:16:58.692720] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:29.974 [2024-07-20 16:16:58.692746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:29.974 [2024-07-20 16:16:58.692868] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0
00:08:29.974 [2024-07-20 16:16:58.692891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:29.974 #21 NEW cov: 11800 ft: 13810 corp: 11/665b lim: 85 exec/s: 0 rss: 67Mb L: 60/74 MS: 1 CrossOver-
00:08:29.974 [2024-07-20 16:16:58.732712] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:29.974 [2024-07-20 16:16:58.732744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:29.974 [2024-07-20 16:16:58.732858] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:29.974 [2024-07-20 16:16:58.732886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:29.974 [2024-07-20 16:16:58.732996] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0
00:08:29.974 [2024-07-20 16:16:58.733022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:29.974 #22 NEW cov: 11800 ft: 13843 corp: 12/725b lim: 85 exec/s: 0 rss: 67Mb L: 60/74 MS: 1 CrossOver-
00:08:29.974 [2024-07-20 16:16:58.772895] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:29.974 [2024-07-20 16:16:58.772933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:29.974 [2024-07-20 16:16:58.773053] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:29.974 [2024-07-20 16:16:58.773076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:29.974 [2024-07-20 16:16:58.773192] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0
00:08:29.974 [2024-07-20 16:16:58.773214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:30.234 #23 NEW cov: 11800 ft: 13900 corp: 13/785b lim: 85 exec/s: 0 rss: 67Mb L: 60/74 MS: 1 ChangeBinInt-
00:08:30.234 [2024-07-20 16:16:58.813197] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:30.234 [2024-07-20 16:16:58.813227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:30.234 [2024-07-20 16:16:58.813332] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:30.234 [2024-07-20 16:16:58.813358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:30.234 [2024-07-20 16:16:58.813455] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0
00:08:30.234 [2024-07-20 16:16:58.813480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:30.234 [2024-07-20 16:16:58.813598] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0
00:08:30.234 [2024-07-20 16:16:58.813623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:30.234 #24 NEW cov: 11800 ft: 13915 corp: 14/859b lim: 85 exec/s: 0 rss: 67Mb L: 74/74 MS: 1 ChangeBinInt-
00:08:30.234 [2024-07-20 16:16:58.853220] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:30.234 [2024-07-20 16:16:58.853254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:30.234 [2024-07-20 16:16:58.853359] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:30.234 [2024-07-20 16:16:58.853386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:30.234 [2024-07-20 16:16:58.853501] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0
00:08:30.234 [2024-07-20 16:16:58.853521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:30.234 [2024-07-20 16:16:58.853640] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0
00:08:30.234 [2024-07-20 16:16:58.853666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:30.234 #25 NEW cov: 11800 ft: 13916 corp: 15/933b lim: 85 exec/s: 0 rss: 67Mb L: 74/74 MS: 1 CMP- DE: "\377\377"-
00:08:30.234 [2024-07-20 16:16:58.893389] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:30.234 [2024-07-20 16:16:58.893422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:30.234 [2024-07-20 16:16:58.893547] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:30.234 [2024-07-20 16:16:58.893567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:30.234 [2024-07-20 16:16:58.893683] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0
00:08:30.234 [2024-07-20 16:16:58.893708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:30.234 [2024-07-20 16:16:58.893825] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0
00:08:30.234 [2024-07-20 16:16:58.893847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:30.234 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:08:30.234 #26 NEW cov: 11823 ft: 13968 corp: 16/1008b lim: 85 exec/s: 0 rss: 67Mb L: 75/75 MS: 1 InsertByte-
00:08:30.234 [2024-07-20 16:16:58.943065] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:30.234 [2024-07-20 16:16:58.943099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:30.234 [2024-07-20 16:16:58.943224] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:30.234 [2024-07-20 16:16:58.943249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:30.234 #27 NEW cov: 11823 ft: 14043 corp: 17/1055b lim: 85 exec/s: 0 rss: 67Mb L: 47/75 MS: 1 EraseBytes-
00:08:30.234 [2024-07-20 16:16:58.983504] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:30.234 [2024-07-20 16:16:58.983538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:30.234 [2024-07-20 16:16:58.983649] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:30.235 [2024-07-20 16:16:58.983676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:30.235 [2024-07-20 16:16:58.983794] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0
00:08:30.235 [2024-07-20 16:16:58.983820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:30.235 #28 NEW cov: 11823 ft: 14053 corp: 18/1115b lim: 85 exec/s: 28 rss: 67Mb L: 60/75 MS: 1 CrossOver-
00:08:30.235 [2024-07-20 16:16:59.023561] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:30.235 [2024-07-20 16:16:59.023596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:30.235 [2024-07-20 16:16:59.023718] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:30.235 [2024-07-20 16:16:59.023742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:30.235 [2024-07-20 16:16:59.023864] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0
00:08:30.235 [2024-07-20 16:16:59.023885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:30.495 #29 NEW cov: 11823 ft: 14111 corp: 19/1176b lim: 85 exec/s: 29 rss: 67Mb L: 61/75 MS: 1 InsertByte-
00:08:30.495 [2024-07-20 16:16:59.063650] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:30.495 [2024-07-20 16:16:59.063685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:30.495 [2024-07-20 16:16:59.063819] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:30.495 [2024-07-20 16:16:59.063837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:30.495 [2024-07-20 16:16:59.063949] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0
00:08:30.495 [2024-07-20 16:16:59.063969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:30.495 #30 NEW cov: 11823 ft: 14123 corp: 20/1237b lim: 85 exec/s: 30 rss: 67Mb L: 61/75 MS: 1 PersAutoDict- DE: "\377\377"-
00:08:30.495 [2024-07-20 16:16:59.113877] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:30.495 [2024-07-20 16:16:59.113908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:30.495 [2024-07-20 16:16:59.113993] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:30.495 [2024-07-20 16:16:59.114015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:30.495 [2024-07-20 16:16:59.114138] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0
00:08:30.495 [2024-07-20 16:16:59.114165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:30.495 #31 NEW cov: 11823 ft: 14175 corp: 21/1297b lim: 85 exec/s: 31 rss: 67Mb L: 60/75 MS: 1 ChangeBit-
00:08:30.495 [2024-07-20 16:16:59.154010] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:30.495 [2024-07-20 16:16:59.154042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:30.495 [2024-07-20 16:16:59.154159] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:30.495 [2024-07-20 16:16:59.154185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:30.495 [2024-07-20 16:16:59.154307] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0
00:08:30.495 [2024-07-20 16:16:59.154335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:30.495 #32 NEW cov: 11823 ft: 14201 corp: 22/1358b lim: 85 exec/s: 32 rss: 67Mb L: 61/75 MS: 1 ChangeByte-
00:08:30.495 [2024-07-20 16:16:59.194315] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:30.495 [2024-07-20 16:16:59.194348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:30.495 [2024-07-20 16:16:59.194405] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:30.495 [2024-07-20 16:16:59.194431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:30.495 [2024-07-20 16:16:59.194547] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0
00:08:30.495 [2024-07-20 16:16:59.194570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:30.495 [2024-07-20 16:16:59.194682] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0
00:08:30.495 [2024-07-20 16:16:59.194706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:30.495 [2024-07-20 16:16:59.234428] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:30.495 [2024-07-20 16:16:59.234468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:30.495 [2024-07-20 16:16:59.234588] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:30.495 [2024-07-20 16:16:59.234611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:30.495 [2024-07-20 16:16:59.234728] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0
00:08:30.495 [2024-07-20 16:16:59.234749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:30.495 [2024-07-20 16:16:59.234868] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0
00:08:30.495 [2024-07-20 16:16:59.234896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:30.495 #34 NEW cov: 11823 ft: 14206 corp: 23/1429b lim: 85 exec/s: 34 rss: 68Mb L: 71/75 MS: 2 ShuffleBytes-ChangeByte-
00:08:30.495 [2024-07-20 16:16:59.274258] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:30.495 [2024-07-20 16:16:59.274291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:30.495 [2024-07-20 16:16:59.274411] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:30.495 [2024-07-20 16:16:59.274437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:30.495 [2024-07-20 16:16:59.274558] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0
00:08:30.495 [2024-07-20 16:16:59.274580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:30.495 #35 NEW cov: 11823 ft: 14210 corp: 24/1490b lim: 85 exec/s: 35 rss: 68Mb L: 61/75 MS: 1 ChangeBit-
00:08:30.755 [2024-07-20 16:16:59.314652] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:30.755 [2024-07-20 16:16:59.314687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:30.755 [2024-07-20 16:16:59.314769] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:30.755 [2024-07-20 16:16:59.314791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:30.755 [2024-07-20 16:16:59.314906] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0
00:08:30.755 [2024-07-20 16:16:59.314922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:30.755 [2024-07-20 16:16:59.315037] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0
00:08:30.755 [2024-07-20 16:16:59.315062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:30.755 #36 NEW cov: 11823 ft: 14296 corp: 25/1564b lim: 85 exec/s: 36 rss: 68Mb L: 74/75 MS: 1 PersAutoDict- DE: "\377\377"-
00:08:30.755 [2024-07-20 16:16:59.364332] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:30.755 [2024-07-20 16:16:59.364359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:30.755 [2024-07-20 16:16:59.364465] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:30.755 [2024-07-20 16:16:59.364490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:30.755 #37 NEW cov: 11823 ft: 14322 corp: 26/1612b lim: 85 exec/s: 37 rss: 68Mb L: 48/75 MS: 1 ChangeBit-
00:08:30.755 [2024-07-20 16:16:59.415018] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:30.755 [2024-07-20 16:16:59.415049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:30.755 [2024-07-20 16:16:59.415113] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:30.755 [2024-07-20 16:16:59.415136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:30.755 [2024-07-20 16:16:59.415252] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0
00:08:30.755 [2024-07-20 16:16:59.415273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:30.755 [2024-07-20 16:16:59.415396] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0
00:08:30.755 [2024-07-20 16:16:59.415416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:30.755 #38 NEW cov: 11823 ft: 14330 corp: 27/1686b lim: 85 exec/s: 38 rss: 68Mb L: 74/75 MS: 1 CrossOver-
00:08:30.755 [2024-07-20 16:16:59.454790] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:30.755 [2024-07-20 16:16:59.454821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:30.755 [2024-07-20 16:16:59.454865] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:30.755 [2024-07-20 16:16:59.454886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:30.755 [2024-07-20 16:16:59.455000] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0
00:08:30.755 [2024-07-20 16:16:59.455024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:30.755 #39 NEW cov: 11823 ft: 14376 corp: 28/1746b lim: 85 exec/s: 39 rss: 68Mb L: 60/75 MS: 1 CMP- DE: "\005\000"-
00:08:30.755 [2024-07-20 16:16:59.495001] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:30.755 [2024-07-20 16:16:59.495036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:30.755 [2024-07-20 16:16:59.495155] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:30.755 [2024-07-20 16:16:59.495177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:30.755 [2024-07-20 16:16:59.495289] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0
00:08:30.755 [2024-07-20 16:16:59.495314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:30.755 #40 NEW cov: 11823 ft: 14377 corp: 29/1808b lim: 85 exec/s: 40 rss: 68Mb L: 62/75 MS: 1 PersAutoDict- DE: "\377\377"-
00:08:30.755 [2024-07-20 16:16:59.534805] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:30.755 [2024-07-20 16:16:59.534835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:30.755 [2024-07-20 16:16:59.534942] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:30.755 [2024-07-20 16:16:59.534963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:31.015 #41 NEW cov: 11823 ft: 14387 corp: 30/1855b lim: 85 exec/s: 41 rss: 68Mb L: 47/75 MS: 1 ShuffleBytes-
00:08:31.015 [2024-07-20 16:16:59.575356] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:31.015 [2024-07-20 16:16:59.575392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:31.015 [2024-07-20 16:16:59.575473] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:31.015 [2024-07-20 16:16:59.575494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:31.015 [2024-07-20 16:16:59.575613] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0
00:08:31.015 [2024-07-20 16:16:59.575634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:31.015 [2024-07-20 16:16:59.575750] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0
00:08:31.015 [2024-07-20 16:16:59.575768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:31.015 #42 NEW cov: 11823 ft: 14402 corp: 31/1930b lim: 85 exec/s: 42 rss: 68Mb L: 75/75 MS: 1 CrossOver-
00:08:31.015 [2024-07-20 16:16:59.615483] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:31.015 [2024-07-20 16:16:59.615513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:31.015 [2024-07-20 16:16:59.615582] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:31.015 [2024-07-20 16:16:59.615605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:31.015 [2024-07-20 16:16:59.615723] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0
00:08:31.015 [2024-07-20 16:16:59.615744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:31.015 [2024-07-20 16:16:59.615865] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0
00:08:31.015 [2024-07-20 16:16:59.615884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:31.015 #43 NEW cov: 11823 ft: 14409 corp: 32/2005b lim: 85 exec/s: 43 rss: 68Mb L: 75/75 MS: 1 ChangeByte-
00:08:31.015 [2024-07-20 16:16:59.655674] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:31.015 [2024-07-20 16:16:59.655706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:31.015 [2024-07-20 16:16:59.655775] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:31.015 [2024-07-20 16:16:59.655798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:31.015 [2024-07-20 16:16:59.655912] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0
00:08:31.015 [2024-07-20 16:16:59.655939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:31.015 [2024-07-20 16:16:59.656046] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0
00:08:31.015 [2024-07-20 16:16:59.656067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:31.015 #44 NEW cov: 11823 ft: 14435 corp: 33/2082b lim: 85 exec/s: 44 rss: 68Mb L: 77/77 MS: 1 CMP- DE: "\014\000"-
00:08:31.015 [2024-07-20 16:16:59.695319] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:31.015 [2024-07-20 16:16:59.695351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:31.015 [2024-07-20 16:16:59.695472] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:31.015 [2024-07-20 16:16:59.695496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:31.015 #45 NEW cov: 11823 ft: 14449 corp: 34/2129b lim: 85 exec/s: 45 rss: 68Mb L: 47/77 MS: 1 ChangeBinInt-
00:08:31.015 [2024-07-20 16:16:59.735905] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:31.015 [2024-07-20 16:16:59.735936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:31.015 [2024-07-20 16:16:59.736016] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:31.015 [2024-07-20 16:16:59.736042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:31.015 [2024-07-20 16:16:59.736152] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0
00:08:31.015 [2024-07-20 16:16:59.736177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:31.015 [2024-07-20 16:16:59.736293] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0
00:08:31.015 [2024-07-20 16:16:59.736314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:31.015 #51 NEW cov: 11823 ft: 14463 corp: 35/2204b lim: 85 exec/s: 51 rss: 69Mb L: 75/77 MS: 1 CopyPart-
00:08:31.015 [2024-07-20 16:16:59.775422] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:31.015 [2024-07-20 16:16:59.775455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:31.015 #54 NEW cov: 11823 ft: 15271 corp: 36/2222b lim: 85 exec/s: 54 rss: 69Mb L: 18/77 MS: 3 CrossOver-ChangeBit-PersAutoDict- DE: "\014\000"-
00:08:31.015 [2024-07-20 16:16:59.815739] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:31.015 [2024-07-20 16:16:59.815771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:31.015 [2024-07-20 16:16:59.815891] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:31.015 [2024-07-20 16:16:59.815915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:31.275 #55 NEW cov: 11823 ft: 15292 corp: 37/2269b lim: 85 exec/s: 55 rss: 69Mb L: 47/77 MS: 1 EraseBytes-
00:08:31.275 [2024-07-20 16:16:59.856321] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:31.275 [2024-07-20 16:16:59.856354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:31.275 [2024-07-20 16:16:59.856440] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:31.275 [2024-07-20 16:16:59.856462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:31.275 [2024-07-20 16:16:59.856589] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0
00:08:31.275 [2024-07-20 16:16:59.856615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:31.275 [2024-07-20 16:16:59.856734] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0
00:08:31.275 [2024-07-20 16:16:59.856759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:31.275 #56 NEW cov: 11823 ft: 15368 corp: 38/2343b lim: 85 exec/s: 56 rss: 69Mb L: 74/77 MS: 1 CrossOver-
00:08:31.275 [2024-07-20 16:16:59.896202] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:31.275 [2024-07-20 16:16:59.896237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:31.275 [2024-07-20 16:16:59.896365] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:31.275 [2024-07-20 16:16:59.896390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:31.275 [2024-07-20 16:16:59.896515] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0
00:08:31.275 [2024-07-20 16:16:59.896537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:31.275 #57 NEW cov: 11823 ft: 15382 corp: 39/2404b lim: 85 exec/s: 57 rss: 69Mb L: 61/77 MS: 1 CrossOver-
00:08:31.275 [2024-07-20 16:16:59.936301] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:31.275 [2024-07-20 16:16:59.936332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:31.275 [2024-07-20 16:16:59.936446] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:31.275 [2024-07-20 16:16:59.936469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:31.275 [2024-07-20 16:16:59.936586] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0
00:08:31.275 [2024-07-20 16:16:59.936607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:31.275 #58 NEW cov: 11823 ft: 15400 corp: 40/2465b lim: 85 exec/s: 58 rss: 69Mb L: 61/77 MS: 1 PersAutoDict- DE: "\014\000"-
00:08:31.275 [2024-07-20 16:16:59.976687] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:31.275 [2024-07-20 16:16:59.976722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:31.275 [2024-07-20 16:16:59.976804] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:31.275 [2024-07-20 16:16:59.976826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:31.275 [2024-07-20 16:16:59.976942] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0
00:08:31.275 [2024-07-20 16:16:59.976962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:31.275 [2024-07-20 16:16:59.977083] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0
00:08:31.275 [2024-07-20 16:16:59.977104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:31.275 #59 NEW cov: 11823 ft: 15404 corp: 41/2542b lim: 85 exec/s: 59 rss: 69Mb L: 77/77 MS: 1 CrossOver-
00:08:31.275 [2024-07-20 16:17:00.016970] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:31.275 [2024-07-20 16:17:00.017006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:31.275 [2024-07-20 16:17:00.017129] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:31.275 [2024-07-20 16:17:00.017156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:31.275 [2024-07-20 16:17:00.017272] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0
00:08:31.275 [2024-07-20 16:17:00.017294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:31.275 [2024-07-20 16:17:00.017418] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0
00:08:31.275 [2024-07-20 16:17:00.017450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:31.275 #60 NEW cov: 11823 ft: 15411 corp: 42/2626b lim: 85 exec/s: 30 rss: 69Mb L: 84/84 MS: 1 CrossOver-
00:08:31.275 #60 DONE cov: 11823 ft: 15411 corp: 42/2626b lim: 85 exec/s: 30 rss: 69Mb
00:08:31.275 ###### Recommended dictionary. ######
00:08:31.275 "\377\377" # Uses: 3
00:08:31.275 "\005\000" # Uses: 0
00:08:31.275 "\014\000" # Uses: 3
00:08:31.275 ###### End of recommended dictionary. ######
00:08:31.275 Done 60 runs in 2 second(s)
00:08:31.534 16:17:00 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_22.conf
00:08:31.534 16:17:00 -- ../common.sh@72 -- # (( i++ ))
00:08:31.534 16:17:00 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:31.534 16:17:00 -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1
00:08:31.534 16:17:00 -- nvmf/run.sh@23 -- # local fuzzer_type=23
00:08:31.534 16:17:00 -- nvmf/run.sh@24 -- # local timen=1
00:08:31.534 16:17:00 -- nvmf/run.sh@25 -- # local core=0x1
00:08:31.534 16:17:00 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23
00:08:31.534 16:17:00 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf
00:08:31.534 16:17:00 -- nvmf/run.sh@29 -- # printf %02d 23
00:08:31.534 16:17:00 -- nvmf/run.sh@29 -- # port=4423
00:08:31.534 16:17:00 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23
00:08:31.534 16:17:00 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423'
00:08:31.534 16:17:00 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:31.534 16:17:00 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 -r /var/tmp/spdk23.sock
00:08:31.534 [2024-07-20 16:17:00.195037] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization...
00:08:31.534 [2024-07-20 16:17:00.195115] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2279643 ]
00:08:31.534 EAL: No free 2048 kB hugepages reported on node 1
00:08:31.794 [2024-07-20 16:17:00.371160] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:31.794 [2024-07-20 16:17:00.391058] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:31.794 [2024-07-20 16:17:00.391195] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:31.794 [2024-07-20 16:17:00.442716] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:31.794 [2024-07-20 16:17:00.459013] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 ***
00:08:31.794 INFO: Running with entropic power schedule (0xFF, 100).
00:08:31.794 INFO: Seed: 1537217159
00:08:31.794 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37),
00:08:31.794 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8),
00:08:31.794 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23
00:08:31.794 INFO: A corpus is not provided, starting from an empty corpus
00:08:31.794 #2 INITED exec/s: 0 rss: 59Mb
00:08:31.794 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:31.794 This may also happen if the target rejected all inputs we tried so far
00:08:31.794 [2024-07-20 16:17:00.524178] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:31.794 [2024-07-20 16:17:00.524208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:31.794 [2024-07-20 16:17:00.524265] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0
00:08:31.794 [2024-07-20 16:17:00.524283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:32.053 NEW_FUNC[1/671]: 0x4bc050 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671
00:08:32.053 NEW_FUNC[2/671]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:08:32.053 #4 NEW cov: 11529 ft: 11530 corp: 2/14b lim: 25 exec/s: 0 rss: 66Mb L: 13/13 MS: 2 ChangeBit-InsertRepeatedBytes-
00:08:32.053 [2024-07-20 16:17:00.855155] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:32.053 [2024-07-20 16:17:00.855217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:32.053 [2024-07-20 16:17:00.855295] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0
00:08:32.053 [2024-07-20 16:17:00.855324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:32.312 #5 NEW cov: 11642 ft: 12128 corp: 3/28b lim: 25 exec/s: 0 rss: 66Mb L: 14/14 MS: 1 CrossOver-
00:08:32.312 [2024-07-20 16:17:00.894943] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:32.312 [2024-07-20 16:17:00.894971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:32.312 [2024-07-20 16:17:00.895002] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0
00:08:32.312 [2024-07-20 16:17:00.895017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:32.312 #6 NEW cov: 11648 ft: 12428 corp: 4/42b lim: 25 exec/s: 0 rss: 66Mb L: 14/14 MS: 1 ChangeBit-
00:08:32.312 [2024-07-20 16:17:00.935124] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:32.312 [2024-07-20 16:17:00.935152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:32.312 [2024-07-20 16:17:00.935195] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0
00:08:32.312 [2024-07-20 16:17:00.935210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:32.312 #7 NEW cov: 11733 ft: 12698 corp: 5/56b lim: 25 exec/s: 0 rss: 66Mb L: 14/14 MS: 1 ChangeByte-
00:08:32.312 [2024-07-20 16:17:00.975190] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:32.312 [2024-07-20 16:17:00.975216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:32.312 [2024-07-20 16:17:00.975248] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0
00:08:32.312 [2024-07-20 16:17:00.975265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:32.312 #8 NEW cov: 11733 ft: 12836 corp: 6/70b lim: 25 exec/s: 0 rss: 66Mb L: 14/14 MS: 1 InsertByte-
00:08:32.312 [2024-07-20 16:17:01.015434] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:32.312 [2024-07-20 16:17:01.015466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:32.312 [2024-07-20 16:17:01.015503] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0
00:08:32.312 [2024-07-20 16:17:01.015519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:32.312 [2024-07-20 16:17:01.015571] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0
00:08:32.312 [2024-07-20 16:17:01.015589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:32.312 #10 NEW cov: 11733 ft: 13112 corp: 7/86b lim: 25 exec/s: 0 rss: 67Mb L: 16/16 MS: 2 InsertRepeatedBytes-InsertRepeatedBytes-
00:08:32.312 [2024-07-20 16:17:01.055476] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:32.312 [2024-07-20 16:17:01.055502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:32.312 [2024-07-20 16:17:01.055535] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0
00:08:32.312 [2024-07-20 16:17:01.055550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:32.312 #11 NEW cov: 11733 ft: 13160 corp: 8/100b lim: 25 exec/s: 0 rss: 67Mb L: 14/16 MS: 1 ChangeBit-
00:08:32.312 [2024-07-20 16:17:01.095670] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:32.312 [2024-07-20 16:17:01.095696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:32.312 [2024-07-20 16:17:01.095729] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0
00:08:32.312 [2024-07-20 16:17:01.095745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:32.312 [2024-07-20 16:17:01.095798] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0
00:08:32.312 [2024-07-20 16:17:01.095814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:32.572 #12 NEW cov: 11733 ft: 13236 corp: 9/115b lim: 25 exec/s: 0 rss: 67Mb L: 15/16 MS: 1 InsertByte-
00:08:32.572 [2024-07-20 16:17:01.135668] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:32.572 [2024-07-20 16:17:01.135694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:32.572 [2024-07-20 16:17:01.135727] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0
00:08:32.572 [2024-07-20 16:17:01.135742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:32.572 #13 NEW cov: 11733 ft: 13275 corp: 10/127b lim: 25 exec/s: 0 rss: 67Mb L: 12/16 MS: 1 EraseBytes-
00:08:32.572 [2024-07-20 16:17:01.175856] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:32.572 [2024-07-20 16:17:01.175882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:32.572 [2024-07-20 16:17:01.175926] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0
00:08:32.572 [2024-07-20 16:17:01.175941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:32.572 [2024-07-20 16:17:01.175992] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0
00:08:32.572 [2024-07-20 16:17:01.176007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:32.572 #14 NEW cov: 11733 ft: 13313 corp: 11/145b lim: 25 exec/s: 0 rss: 67Mb L: 18/18 MS: 1 CMP- DE: "\003\000"-
00:08:32.572 [2024-07-20 16:17:01.216208] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:32.572 [2024-07-20 16:17:01.216235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:32.572 [2024-07-20 16:17:01.216290] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0
00:08:32.572 [2024-07-20 16:17:01.216308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:32.572 [2024-07-20 16:17:01.216359] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0
00:08:32.572 [2024-07-20 16:17:01.216374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:32.572 [2024-07-20 16:17:01.216427] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0
00:08:32.572 [2024-07-20 16:17:01.216445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:32.572 [2024-07-20 16:17:01.216512] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0
00:08:32.572 [2024-07-20 16:17:01.216527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1
00:08:32.572 #15 NEW cov: 11733 ft: 13795 corp: 12/170b lim: 25 exec/s: 0 rss: 67Mb L: 25/25 MS: 1 CrossOver-
00:08:32.572 [2024-07-20 16:17:01.255981] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:32.572 [2024-07-20 16:17:01.256007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:32.572 [2024-07-20 16:17:01.256039] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0
00:08:32.572 [2024-07-20 16:17:01.256055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:32.572 #16 NEW cov: 11733 ft: 13809 corp: 13/184b lim: 25 exec/s: 0 rss: 67Mb L: 14/25 MS: 1 ShuffleBytes-
00:08:32.572 [2024-07-20 16:17:01.286316] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:32.572 [2024-07-20 16:17:01.286342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:32.572 [2024-07-20 16:17:01.286405] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0
00:08:32.572 [2024-07-20 16:17:01.286422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:32.572 [2024-07-20 16:17:01.286474] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0
00:08:32.572 [2024-07-20 16:17:01.286488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:32.572 [2024-07-20 16:17:01.286544] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0
00:08:32.572 [2024-07-20 16:17:01.286559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:32.572 #17 NEW cov: 11733 ft: 13827 corp: 14/206b lim: 25 exec/s: 0 rss: 67Mb L: 22/25 MS: 1 EraseBytes-
00:08:32.572 [2024-07-20 16:17:01.326363] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:32.572 [2024-07-20 16:17:01.326405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:32.572 [2024-07-20 16:17:01.326452] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0
00:08:32.572 [2024-07-20 16:17:01.326468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:32.572 [2024-07-20 16:17:01.326521] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0
00:08:32.572 [2024-07-20 16:17:01.326537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:32.572 #18 NEW cov: 11733 ft: 13847 corp: 15/225b lim: 25 exec/s: 0 rss: 67Mb L: 19/25 MS: 1 InsertRepeatedBytes-
00:08:32.572 [2024-07-20 16:17:01.366503] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:32.572 [2024-07-20 16:17:01.366530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:32.572 [2024-07-20 16:17:01.366570] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0
00:08:32.572 [2024-07-20 16:17:01.366585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:32.572 [2024-07-20 16:17:01.366637] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0
00:08:32.572 [2024-07-20 16:17:01.366654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:32.833 #19 NEW cov: 11733 ft: 13910 corp: 16/243b lim: 25 exec/s: 0 rss: 67Mb L: 18/25 MS: 1 CMP- DE: "\001\000\002\000"-
00:08:32.833 [2024-07-20 16:17:01.406810] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:32.833 [2024-07-20 16:17:01.406837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:32.833 [2024-07-20 16:17:01.406884] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0
00:08:32.833 [2024-07-20 16:17:01.406900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:32.833 [2024-07-20 16:17:01.406950] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0
00:08:32.833 [2024-07-20 16:17:01.406966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:32.833 [2024-07-20 16:17:01.407016] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0
00:08:32.833 [2024-07-20 16:17:01.407030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:32.833 [2024-07-20 16:17:01.407082] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0
00:08:32.833 [2024-07-20 16:17:01.407096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1
00:08:32.833 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:08:32.833 #20 NEW cov: 11756 ft: 13955 corp: 17/268b lim: 25 exec/s: 0 rss: 67Mb L: 25/25 MS: 1 CopyPart-
00:08:32.833 [2024-07-20 16:17:01.446603] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:32.833 [2024-07-20 16:17:01.446630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:32.833 [2024-07-20 16:17:01.446675] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0
00:08:32.833 [2024-07-20 16:17:01.446691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:32.833 #21 NEW cov: 11756 ft: 14001 corp: 18/282b lim: 25 exec/s: 0 rss: 67Mb L: 14/25 MS: 1 ChangeBinInt-
00:08:32.833 [2024-07-20 16:17:01.487017] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:32.833 [2024-07-20 16:17:01.487043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:32.833 [2024-07-20 16:17:01.487097] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0
00:08:32.833 [2024-07-20 16:17:01.487111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:32.833 [2024-07-20 16:17:01.487164] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0
00:08:32.833 [2024-07-20 16:17:01.487183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:32.833 [2024-07-20 16:17:01.487234] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0
00:08:32.833 [2024-07-20 16:17:01.487249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:32.833 [2024-07-20 16:17:01.487300] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0
00:08:32.833 [2024-07-20 16:17:01.487315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1
00:08:32.833 #22 NEW cov: 11756 ft: 14005 corp: 19/307b lim: 25 exec/s: 22 rss: 67Mb L: 25/25 MS: 1 CopyPart-
00:08:32.833 [2024-07-20 16:17:01.527055] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:32.833 [2024-07-20 16:17:01.527081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:32.833 [2024-07-20 16:17:01.527125] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0
00:08:32.833 [2024-07-20 16:17:01.527140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:32.833 [2024-07-20 16:17:01.527192] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0
00:08:32.833 [2024-07-20 16:17:01.527206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:32.833 [2024-07-20 16:17:01.527258] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0
00:08:32.833 [2024-07-20 16:17:01.527273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:32.833 #23 NEW cov: 11756 ft: 14019 corp: 20/330b lim: 25 exec/s: 23 rss: 67Mb L: 23/25 MS: 1 InsertByte-
00:08:32.833 [2024-07-20 16:17:01.567044] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:32.833 [2024-07-20 16:17:01.567071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:32.833 [2024-07-20 16:17:01.567106] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0
00:08:32.833 [2024-07-20 16:17:01.567121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:32.833 [2024-07-20 16:17:01.567174] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0
00:08:32.833 [2024-07-20 16:17:01.567190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:32.833 [2024-07-20 16:17:01.607125] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:32.833 [2024-07-20 16:17:01.607152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:32.833 [2024-07-20 16:17:01.607193] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0
00:08:32.833 [2024-07-20 16:17:01.607209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:32.833 [2024-07-20 16:17:01.607263] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0
00:08:32.833 [2024-07-20 16:17:01.607278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:32.833 #25 NEW cov: 11756 ft: 14068 corp: 21/345b lim: 25 exec/s: 25 rss: 67Mb L: 15/25 MS: 2 ChangeByte-CMP- DE: "\377\377\377\033"-
00:08:33.093 [2024-07-20 16:17:01.647508] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:33.093 [2024-07-20 16:17:01.647538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:33.093 [2024-07-20 16:17:01.647585] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0
00:08:33.093 [2024-07-20 16:17:01.647601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:33.093 [2024-07-20 16:17:01.647655] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0
00:08:33.093 [2024-07-20 16:17:01.647668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:33.093 [2024-07-20 16:17:01.647720] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0
00:08:33.093 [2024-07-20 16:17:01.647735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:33.093 [2024-07-20 16:17:01.647787] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0
00:08:33.093 [2024-07-20 16:17:01.647803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1
00:08:33.093 #26 NEW cov: 11756 ft: 14110 corp: 22/370b lim: 25 exec/s: 26 rss: 67Mb L: 25/25 MS: 1 ShuffleBytes-
00:08:33.093 [2024-07-20 16:17:01.687258] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:33.093 [2024-07-20 16:17:01.687285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:33.093 [2024-07-20 16:17:01.687327] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0
00:08:33.093 [2024-07-20 16:17:01.687343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:33.093 #27 NEW cov: 11756 ft: 14135 corp: 23/380b lim: 25 exec/s: 27 rss: 68Mb L: 10/25 MS: 1 EraseBytes-
00:08:33.093 [2024-07-20 16:17:01.727609] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:33.093 [2024-07-20 16:17:01.727635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:33.093 [2024-07-20 16:17:01.727683] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0
00:08:33.093 [2024-07-20 16:17:01.727698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:33.093 [2024-07-20 16:17:01.727752] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0
00:08:33.093 [2024-07-20 16:17:01.727767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:33.093 [2024-07-20 16:17:01.727818] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0
00:08:33.093 [2024-07-20 16:17:01.727831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:33.094 #28 NEW cov: 11756 ft: 14182 corp: 24/404b lim: 25 exec/s: 28 rss: 68Mb L: 24/25 MS: 1 CopyPart-
00:08:33.094 [2024-07-20 16:17:01.767669] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:33.094 [2024-07-20 16:17:01.767695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:33.094 [2024-07-20 16:17:01.767728] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0
00:08:33.094 [2024-07-20 16:17:01.767744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:33.094 [2024-07-20 16:17:01.767797] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0
00:08:33.094 [2024-07-20 16:17:01.767816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:33.094 #29 NEW cov: 11756 ft: 14189 corp: 25/422b lim: 25 exec/s: 29 rss: 68Mb L: 18/25 MS: 1 PersAutoDict- DE: "\001\000\002\000"-
00:08:33.094 [2024-07-20 16:17:01.807993] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:33.094 [2024-07-20 16:17:01.808020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.094 [2024-07-20 16:17:01.808071] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:33.094 [2024-07-20 16:17:01.808086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.094 [2024-07-20 16:17:01.808139] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:33.094 [2024-07-20 16:17:01.808154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.094 [2024-07-20 16:17:01.808207] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:33.094 [2024-07-20 16:17:01.808221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.094 [2024-07-20 16:17:01.808274] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:33.094 [2024-07-20 16:17:01.808289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:33.094 #30 NEW cov: 11756 ft: 14202 corp: 26/447b lim: 25 exec/s: 30 rss: 68Mb L: 25/25 MS: 1 CopyPart- 00:08:33.094 [2024-07-20 16:17:01.847858] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:33.094 [2024-07-20 16:17:01.847885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.094 [2024-07-20 16:17:01.847935] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:33.094 [2024-07-20 16:17:01.847951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.094 [2024-07-20 16:17:01.848005] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:33.094 [2024-07-20 16:17:01.848022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.094 #31 NEW cov: 11756 ft: 14215 corp: 27/466b lim: 25 exec/s: 31 rss: 68Mb L: 19/25 MS: 1 InsertByte- 00:08:33.094 [2024-07-20 16:17:01.887968] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:33.094 [2024-07-20 16:17:01.887994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.094 [2024-07-20 16:17:01.888029] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:33.094 [2024-07-20 16:17:01.888045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.094 [2024-07-20 16:17:01.888096] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:33.094 [2024-07-20 16:17:01.888112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.353 #32 NEW cov: 11756 ft: 14259 corp: 28/482b lim: 25 exec/s: 32 rss: 68Mb L: 16/25 MS: 1 PersAutoDict- DE: "\003\000"- 00:08:33.353 [2024-07-20 16:17:01.928168] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:33.353 [2024-07-20 16:17:01.928197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.353 [2024-07-20 16:17:01.928238] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:33.353 [2024-07-20 16:17:01.928254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.354 [2024-07-20 16:17:01.928304] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:33.354 [2024-07-20 16:17:01.928318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.354 [2024-07-20 16:17:01.928368] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:33.354 [2024-07-20 16:17:01.928383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.354 #33 NEW cov: 11756 ft: 14268 corp: 29/505b lim: 25 exec/s: 33 rss: 68Mb L: 23/25 MS: 1 CrossOver- 00:08:33.354 [2024-07-20 16:17:01.968058] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:33.354 [2024-07-20 16:17:01.968085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.354 [2024-07-20 16:17:01.968119] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:33.354 [2024-07-20 16:17:01.968135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.354 #34 NEW cov: 11756 ft: 14274 corp: 30/516b lim: 25 exec/s: 34 rss: 68Mb L: 11/25 MS: 1 EraseBytes- 00:08:33.354 [2024-07-20 16:17:02.008180] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:33.354 [2024-07-20 16:17:02.008206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.354 [2024-07-20 16:17:02.008247] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:33.354 [2024-07-20 16:17:02.008261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.354 #35 NEW cov: 11756 ft: 14292 corp: 31/526b lim: 25 exec/s: 35 rss: 68Mb L: 10/25 MS: 1 EraseBytes- 00:08:33.354 [2024-07-20 16:17:02.048387] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:33.354 [2024-07-20 16:17:02.048414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.354 [2024-07-20 16:17:02.048479] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:33.354 
[2024-07-20 16:17:02.048495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.354 [2024-07-20 16:17:02.048547] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:33.354 [2024-07-20 16:17:02.048563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.354 #36 NEW cov: 11756 ft: 14305 corp: 32/541b lim: 25 exec/s: 36 rss: 68Mb L: 15/25 MS: 1 ShuffleBytes- 00:08:33.354 [2024-07-20 16:17:02.088776] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:33.354 [2024-07-20 16:17:02.088802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.354 [2024-07-20 16:17:02.088855] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:33.354 [2024-07-20 16:17:02.088871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.354 [2024-07-20 16:17:02.088924] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:33.354 [2024-07-20 16:17:02.088941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.354 [2024-07-20 16:17:02.088992] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:33.354 [2024-07-20 16:17:02.089006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.354 [2024-07-20 16:17:02.089057] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:33.354 [2024-07-20 16:17:02.089072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:33.354 #37 NEW cov: 11756 ft: 14321 corp: 33/566b lim: 25 exec/s: 37 rss: 68Mb L: 25/25 MS: 1 ChangeByte- 00:08:33.354 [2024-07-20 16:17:02.128547] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:33.354 [2024-07-20 16:17:02.128574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.354 [2024-07-20 16:17:02.128605] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:33.354 [2024-07-20 16:17:02.128620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.354 #38 NEW cov: 11756 ft: 14330 corp: 34/576b lim: 25 exec/s: 38 rss: 68Mb L: 10/25 MS: 1 ShuffleBytes- 00:08:33.613 [2024-07-20 16:17:02.168890] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:33.613 [2024-07-20 16:17:02.168918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.613 [2024-07-20 16:17:02.168965] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:33.614 [2024-07-20 
16:17:02.168981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.614 [2024-07-20 16:17:02.169035] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:33.614 [2024-07-20 16:17:02.169051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.614 [2024-07-20 16:17:02.169105] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:33.614 [2024-07-20 16:17:02.169119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.614 #39 NEW cov: 11756 ft: 14342 corp: 35/600b lim: 25 exec/s: 39 rss: 68Mb L: 24/25 MS: 1 InsertRepeatedBytes- 00:08:33.614 [2024-07-20 16:17:02.208962] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:33.614 [2024-07-20 16:17:02.208990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.614 [2024-07-20 16:17:02.209052] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:33.614 [2024-07-20 16:17:02.209067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.614 [2024-07-20 16:17:02.209120] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:33.614 [2024-07-20 16:17:02.209136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.614 [2024-07-20 16:17:02.209188] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:33.614 [2024-07-20 16:17:02.209204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.614 #40 NEW cov: 11756 ft: 14346 corp: 36/622b lim: 25 exec/s: 40 rss: 68Mb L: 22/25 MS: 1 CMP- DE: "\001.\362z\265\220\003B"- 00:08:33.614 [2024-07-20 16:17:02.249112] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:33.614 [2024-07-20 16:17:02.249138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.614 [2024-07-20 16:17:02.249183] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:33.614 [2024-07-20 16:17:02.249199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.614 [2024-07-20 16:17:02.249250] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:33.614 [2024-07-20 16:17:02.249281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.614 [2024-07-20 16:17:02.249333] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:33.614 [2024-07-20 16:17:02.249349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.614 #41 NEW cov: 11756 ft: 14398 corp: 37/645b lim: 25 exec/s: 41 rss: 68Mb L: 23/25 MS: 1 InsertRepeatedBytes- 00:08:33.614 [2024-07-20 16:17:02.288925] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:33.614 [2024-07-20 16:17:02.288950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.614 #45 NEW cov: 11756 ft: 14752 corp: 38/653b lim: 25 exec/s: 45 rss: 68Mb L: 8/25 MS: 4 CrossOver-CrossOver-InsertByte-CMP- DE: "\001\000\000\037"- 00:08:33.614 [2024-07-20 16:17:02.329237] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:33.614 [2024-07-20 16:17:02.329264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.614 [2024-07-20 16:17:02.329299] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:33.614 [2024-07-20 16:17:02.329315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.614 [2024-07-20 16:17:02.329366] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:33.614 [2024-07-20 16:17:02.329382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.614 #46 NEW cov: 11756 ft: 14770 corp: 39/672b lim: 25 exec/s: 46 rss: 68Mb L: 19/25 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\002"- 00:08:33.614 [2024-07-20 16:17:02.369593] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:33.614 [2024-07-20 16:17:02.369619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.614 [2024-07-20 16:17:02.369668] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:33.614 [2024-07-20 16:17:02.369682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.614 [2024-07-20 16:17:02.369733] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:33.614 [2024-07-20 16:17:02.369747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.614 [2024-07-20 16:17:02.369797] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:33.614 [2024-07-20 16:17:02.369812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.614 [2024-07-20 16:17:02.369863] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:33.614 [2024-07-20 16:17:02.369877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:33.614 #47 NEW cov: 11756 ft: 14783 corp: 40/697b lim: 25 exec/s: 47 rss: 68Mb L: 25/25 MS: 1 ChangeByte- 00:08:33.614 [2024-07-20 16:17:02.409491] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:33.614 [2024-07-20 16:17:02.409517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.614 [2024-07-20 16:17:02.409552] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:33.614 [2024-07-20 16:17:02.409567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.614 [2024-07-20 16:17:02.409618] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:33.614 [2024-07-20 16:17:02.409633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.874 #48 NEW cov: 11756 ft: 14787 corp: 41/712b lim: 25 exec/s: 48 rss: 69Mb L: 15/25 MS: 1 InsertByte- 00:08:33.874 [2024-07-20 16:17:02.449493] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:33.874 [2024-07-20 16:17:02.449521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.874 [2024-07-20 16:17:02.449551] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:33.874 [2024-07-20 16:17:02.449566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.874 #49 NEW cov: 11756 ft: 14796 corp: 42/722b lim: 25 exec/s: 49 rss: 69Mb L: 10/25 MS: 1 EraseBytes- 00:08:33.874 [2024-07-20 16:17:02.479905] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:33.874 [2024-07-20 16:17:02.479931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.874 [2024-07-20 16:17:02.479982] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:33.874 [2024-07-20 16:17:02.479997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.874 [2024-07-20 16:17:02.480048] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:33.874 [2024-07-20 16:17:02.480064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.874 [2024-07-20 16:17:02.480114] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:33.874 [2024-07-20 16:17:02.480129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.874 [2024-07-20 16:17:02.480179] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:33.874 [2024-07-20 16:17:02.480194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:33.874 #50 NEW cov: 11756 ft: 14803 corp: 43/747b lim: 25 exec/s: 25 rss: 69Mb L: 25/25 MS: 1 ChangeByte- 00:08:33.874 #50 DONE cov: 11756 ft: 14803 corp: 43/747b lim: 25 exec/s: 
25 rss: 69Mb
00:08:33.874 ###### Recommended dictionary. ######
00:08:33.874 "\003\000" # Uses: 1
00:08:33.874 "\001\000\002\000" # Uses: 1
00:08:33.874 "\377\377\377\033" # Uses: 0
00:08:33.874 "\001.\362z\265\220\003B" # Uses: 0
00:08:33.874 "\001\000\000\037" # Uses: 0
00:08:33.874 "\001\000\000\000\000\000\000\002" # Uses: 0
00:08:33.874 ###### End of recommended dictionary. ######
00:08:33.874 Done 50 runs in 2 second(s)
00:08:33.874 16:17:02 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_23.conf
16:17:02 -- ../common.sh@72 -- # (( i++ ))
16:17:02 -- ../common.sh@72 -- # (( i < fuzz_num ))
16:17:02 -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1
16:17:02 -- nvmf/run.sh@23 -- # local fuzzer_type=24
16:17:02 -- nvmf/run.sh@24 -- # local timen=1
16:17:02 -- nvmf/run.sh@25 -- # local core=0x1
16:17:02 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24
16:17:02 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf
16:17:02 -- nvmf/run.sh@29 -- # printf %02d 24
16:17:02 -- nvmf/run.sh@29 -- # port=4424
16:17:02 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24
16:17:02 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424'
16:17:02 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
16:17:02 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 -r /var/tmp/spdk24.sock
[2024-07-20 16:17:02.656750] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization...
[2024-07-20 16:17:02.656839] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2280173 ]
EAL: No free 2048 kB hugepages reported on node 1
[2024-07-20 16:17:02.832549] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
[2024-07-20 16:17:02.851822] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
[2024-07-20 16:17:02.851963] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
[2024-07-20 16:17:02.903348] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
[2024-07-20 16:17:02.919670] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 ***
INFO: Running with entropic power schedule (0xFF, 100).
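For reference, the nvmf/run.sh trace above collapses to the following standalone launch sequence for run 24. This is a minimal sketch, not the harness itself: it assumes this job's workspace layout, and the redirect of the sed output into $nvmf_cfg (/tmp/fuzz_json_24.conf) is inferred, since the -x trace does not print redirections.

# Sketch of what start_llvm_fuzz 24 1 0x1 does, per the trace above.
SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
port=4424   # "44" followed by printf %02d 24, per the run.sh@29 trace
trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
mkdir -p "$SPDK/../corpus/llvm_nvmf_24"
# Rewrite the default listener port in the shared fuzz config; the output
# file here is the nvmf_cfg value from run.sh@27 (redirect assumed).
sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' \
    "$SPDK/test/fuzz/llvm/nvmf/fuzz_json.conf" > /tmp/fuzz_json_24.conf
# Launch the libFuzzer-based target, pinned to core 0x1 with 512 MB hugepages.
"$SPDK/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
    -P "$SPDK/../output/llvm/" -F "$trid" -c /tmp/fuzz_json_24.conf \
    -t 1 -D "$SPDK/../corpus/llvm_nvmf_24" -Z 24 -r /var/tmp/spdk24.sock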
00:08:34.133 INFO: Seed: 3997226757 00:08:34.392 INFO: Loaded 1 modules (341291 inline 8-bit counters): 341291 [0x266470c, 0x26b7c37), 00:08:34.392 INFO: Loaded 1 PC tables (341291 PCs): 341291 [0x26b7c38,0x2becee8), 00:08:34.392 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:34.392 INFO: A corpus is not provided, starting from an empty corpus 00:08:34.392 #2 INITED exec/s: 0 rss: 60Mb 00:08:34.392 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:34.392 This may also happen if the target rejected all inputs we tried so far 00:08:34.392 [2024-07-20 16:17:02.979087] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846791579194835779 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.392 [2024-07-20 16:17:02.979118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.392 [2024-07-20 16:17:02.979150] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.392 [2024-07-20 16:17:02.979166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.392 [2024-07-20 16:17:02.979218] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.392 [2024-07-20 16:17:02.979233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.392 [2024-07-20 16:17:02.979286] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.392 [2024-07-20 16:17:02.979303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.650 NEW_FUNC[1/672]: 0x4bd130 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:08:34.650 NEW_FUNC[2/672]: 0x4cdd90 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:34.650 #9 NEW cov: 11601 ft: 11602 corp: 2/89b lim: 100 exec/s: 0 rss: 66Mb L: 88/88 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:34.650 [2024-07-20 16:17:03.300164] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846791579194835779 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.650 [2024-07-20 16:17:03.300224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.650 [2024-07-20 16:17:03.300320] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580151137025 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.650 [2024-07-20 16:17:03.300351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.650 [2024-07-20 16:17:03.300427] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.650 
[2024-07-20 16:17:03.300463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.650 [2024-07-20 16:17:03.300540] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.650 [2024-07-20 16:17:03.300569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.650 #10 NEW cov: 11714 ft: 12152 corp: 3/177b lim: 100 exec/s: 0 rss: 66Mb L: 88/88 MS: 1 ChangeByte- 00:08:34.650 [2024-07-20 16:17:03.350003] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846791579194835779 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.650 [2024-07-20 16:17:03.350033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.650 [2024-07-20 16:17:03.350073] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580117582659 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.650 [2024-07-20 16:17:03.350090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.650 [2024-07-20 16:17:03.350141] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.650 [2024-07-20 16:17:03.350156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.650 [2024-07-20 16:17:03.350210] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.650 [2024-07-20 16:17:03.350225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.650 #11 NEW cov: 11720 ft: 12516 corp: 4/265b lim: 100 exec/s: 0 rss: 66Mb L: 88/88 MS: 1 ChangeBit- 00:08:34.650 [2024-07-20 16:17:03.390105] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846791579194835779 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.650 [2024-07-20 16:17:03.390134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.650 [2024-07-20 16:17:03.390188] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.650 [2024-07-20 16:17:03.390208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.650 [2024-07-20 16:17:03.390263] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.650 [2024-07-20 16:17:03.390280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.650 [2024-07-20 16:17:03.390334] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:4846791582492315203 len:17220 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:08:34.650 [2024-07-20 16:17:03.390350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.650 #12 NEW cov: 11805 ft: 12773 corp: 5/361b lim: 100 exec/s: 0 rss: 66Mb L: 96/96 MS: 1 InsertRepeatedBytes- 00:08:34.650 [2024-07-20 16:17:03.430238] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:5425512961765231435 len:19276 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.650 [2024-07-20 16:17:03.430266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.650 [2024-07-20 16:17:03.430313] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.650 [2024-07-20 16:17:03.430329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.650 [2024-07-20 16:17:03.430383] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.650 [2024-07-20 16:17:03.430399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.650 [2024-07-20 16:17:03.430455] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.650 [2024-07-20 16:17:03.430485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.650 #17 NEW cov: 11805 ft: 12831 corp: 6/452b lim: 100 exec/s: 0 rss: 66Mb L: 91/96 MS: 5 ChangeBit-CrossOver-ShuffleBytes-InsertRepeatedBytes-InsertRepeatedBytes- 00:08:34.908 [2024-07-20 16:17:03.470355] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846791579194835779 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.908 [2024-07-20 16:17:03.470384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.908 [2024-07-20 16:17:03.470428] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.908 [2024-07-20 16:17:03.470448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.908 [2024-07-20 16:17:03.470500] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.908 [2024-07-20 16:17:03.470515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.908 [2024-07-20 16:17:03.470566] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:4846791582492315203 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.908 [2024-07-20 16:17:03.470581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.908 #18 NEW cov: 11805 ft: 12920 corp: 7/548b lim: 100 exec/s: 0 rss: 66Mb L: 96/96 MS: 1 ShuffleBytes- 
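The run-23 summary above ends with a "Recommended dictionary" block. As a hedged sketch, those entries could be carried forward in libFuzzer's dictionary format: the file name below is hypothetical, the octal escapes from the summary are rewritten as the \xNN hex escapes dictionary files expect, and whether llvm_nvme_fuzz forwards the standard libFuzzer -dict= option is an assumption this log does not confirm.

# Hypothetical dictionary file built from the run-23 summary above.
cat > /tmp/nvmf_23.dict <<'EOF'
# Entries from the log's "Recommended dictionary", octal -> hex
kw1="\x03\x00"
kw2="\x01\x00\x02\x00"
kw3="\xff\xff\xff\x1b"
kw4="\x01.\xf2z\xb5\x90\x03B"
kw5="\x01\x00\x00\x1f"
kw6="\x01\x00\x00\x00\x00\x00\x00\x02"
EOF
# If the fuzzer passes libFuzzer options through (an assumption), appending
# -dict=/tmp/nvmf_23.dict to the llvm_nvme_fuzz invocation would seed these tokens.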
00:08:34.908 [2024-07-20 16:17:03.510177] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846791579194835779 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.908 [2024-07-20 16:17:03.510204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.908 [2024-07-20 16:17:03.510252] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580117582659 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.908 [2024-07-20 16:17:03.510268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.908 #19 NEW cov: 11805 ft: 13432 corp: 8/601b lim: 100 exec/s: 0 rss: 66Mb L: 53/96 MS: 1 EraseBytes- 00:08:34.908 [2024-07-20 16:17:03.550609] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846791579194835779 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.908 [2024-07-20 16:17:03.550637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.908 [2024-07-20 16:17:03.550683] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580117582659 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.908 [2024-07-20 16:17:03.550699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.908 [2024-07-20 16:17:03.550752] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4846791579022655488 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.908 [2024-07-20 16:17:03.550786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.908 [2024-07-20 16:17:03.550841] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.908 [2024-07-20 16:17:03.550857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.908 #20 NEW cov: 11805 ft: 13479 corp: 9/693b lim: 100 exec/s: 0 rss: 66Mb L: 92/96 MS: 1 InsertRepeatedBytes- 00:08:34.908 [2024-07-20 16:17:03.590731] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846905928404124483 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.908 [2024-07-20 16:17:03.590759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.908 [2024-07-20 16:17:03.590812] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.908 [2024-07-20 16:17:03.590827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.908 [2024-07-20 16:17:03.590882] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.908 [2024-07-20 16:17:03.590899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.909 [2024-07-20 16:17:03.590954] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:4846791582492315203 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.909 [2024-07-20 16:17:03.590969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.909 #21 NEW cov: 11805 ft: 13484 corp: 10/789b lim: 100 exec/s: 0 rss: 67Mb L: 96/96 MS: 1 ChangeByte- 00:08:34.909 [2024-07-20 16:17:03.630781] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846791579194835779 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.909 [2024-07-20 16:17:03.630809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.909 [2024-07-20 16:17:03.630863] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.909 [2024-07-20 16:17:03.630879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.909 [2024-07-20 16:17:03.630933] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.909 [2024-07-20 16:17:03.630947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.909 [2024-07-20 16:17:03.631001] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:617823738964921923 len:11777 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.909 [2024-07-20 16:17:03.631017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.909 #22 NEW cov: 11805 ft: 13528 corp: 11/885b lim: 100 exec/s: 0 rss: 67Mb L: 96/96 MS: 1 CMP- DE: "\010\222\363j{\362.\000"- 00:08:34.909 [2024-07-20 16:17:03.670795] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846791579194835779 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.909 [2024-07-20 16:17:03.670825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.909 [2024-07-20 16:17:03.670864] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580117582659 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.909 [2024-07-20 16:17:03.670879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.909 [2024-07-20 16:17:03.670933] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4846791579022655488 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.909 [2024-07-20 16:17:03.670949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.909 #23 NEW cov: 11805 ft: 13857 corp: 12/961b lim: 100 exec/s: 0 rss: 67Mb L: 76/96 MS: 1 EraseBytes- 00:08:34.909 [2024-07-20 16:17:03.711026] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 
lba:4846791579194835779 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.909 [2024-07-20 16:17:03.711055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.909 [2024-07-20 16:17:03.711093] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580117582659 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.909 [2024-07-20 16:17:03.711109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.909 [2024-07-20 16:17:03.711162] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.909 [2024-07-20 16:17:03.711179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.909 [2024-07-20 16:17:03.711235] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:17539967979188586642 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.909 [2024-07-20 16:17:03.711251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.166 #24 NEW cov: 11805 ft: 13896 corp: 13/1049b lim: 100 exec/s: 0 rss: 67Mb L: 88/96 MS: 1 PersAutoDict- DE: "\010\222\363j{\362.\000"- 00:08:35.166 [2024-07-20 16:17:03.750853] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846791579194835779 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.166 [2024-07-20 16:17:03.750885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.166 [2024-07-20 16:17:03.750940] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580117582659 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.166 [2024-07-20 16:17:03.750955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.166 #25 NEW cov: 11805 ft: 13906 corp: 14/1102b lim: 100 exec/s: 0 rss: 67Mb L: 53/96 MS: 1 CopyPart- 00:08:35.166 [2024-07-20 16:17:03.790951] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:5999713083801682755 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.166 [2024-07-20 16:17:03.790980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.166 [2024-07-20 16:17:03.791050] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580117582659 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.166 [2024-07-20 16:17:03.791065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.166 #26 NEW cov: 11805 ft: 13924 corp: 15/1155b lim: 100 exec/s: 0 rss: 67Mb L: 53/96 MS: 1 ChangeBit- 00:08:35.166 [2024-07-20 16:17:03.831071] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846791579194835779 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.166 [2024-07-20 16:17:03.831098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.166 [2024-07-20 16:17:03.831131] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580151137089 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.166 [2024-07-20 16:17:03.831146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.166 NEW_FUNC[1/1]: 0x1971680 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:35.166 #27 NEW cov: 11828 ft: 13972 corp: 16/1211b lim: 100 exec/s: 0 rss: 67Mb L: 56/96 MS: 1 CopyPart- 00:08:35.166 [2024-07-20 16:17:03.871172] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846791579194835779 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.166 [2024-07-20 16:17:03.871200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.166 [2024-07-20 16:17:03.871244] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.166 [2024-07-20 16:17:03.871258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.166 #28 NEW cov: 11828 ft: 14013 corp: 17/1267b lim: 100 exec/s: 0 rss: 67Mb L: 56/96 MS: 1 CrossOver- 00:08:35.166 [2024-07-20 16:17:03.911600] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846791579194835779 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.166 [2024-07-20 16:17:03.911629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.166 [2024-07-20 16:17:03.911692] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580117582659 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.166 [2024-07-20 16:17:03.911708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.166 [2024-07-20 16:17:03.911763] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.166 [2024-07-20 16:17:03.911779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.166 [2024-07-20 16:17:03.911835] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:17539967979188586642 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.166 [2024-07-20 16:17:03.911850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.166 #29 NEW cov: 11828 ft: 14036 corp: 18/1355b lim: 100 exec/s: 0 rss: 67Mb L: 88/96 MS: 1 ChangeBinInt- 00:08:35.166 [2024-07-20 16:17:03.951723] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846905928404124483 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.166 [2024-07-20 16:17:03.951752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.166 
[2024-07-20 16:17:03.951787] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.166 [2024-07-20 16:17:03.951803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.166 [2024-07-20 16:17:03.951857] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.166 [2024-07-20 16:17:03.951874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.166 [2024-07-20 16:17:03.951927] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:14862797153764298446 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.166 [2024-07-20 16:17:03.951943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.424 #30 NEW cov: 11828 ft: 14054 corp: 19/1453b lim: 100 exec/s: 30 rss: 67Mb L: 98/98 MS: 1 CopyPart- 00:08:35.424 [2024-07-20 16:17:03.991820] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846905928404124483 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.424 [2024-07-20 16:17:03.991849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.424 [2024-07-20 16:17:03.991892] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.424 [2024-07-20 16:17:03.991909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.424 [2024-07-20 16:17:03.991963] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4846791580152185667 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.424 [2024-07-20 16:17:03.991980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.424 [2024-07-20 16:17:03.992034] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:4846791582492315203 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.424 [2024-07-20 16:17:03.992049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.424 #31 NEW cov: 11828 ft: 14138 corp: 20/1549b lim: 100 exec/s: 31 rss: 67Mb L: 96/98 MS: 1 ChangeBit- 00:08:35.424 [2024-07-20 16:17:04.031938] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846791325791765315 len:27260 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.424 [2024-07-20 16:17:04.031967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.424 [2024-07-20 16:17:04.032005] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580117582659 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.424 [2024-07-20 16:17:04.032020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.424 [2024-07-20 16:17:04.032076] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.424 [2024-07-20 16:17:04.032093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.424 [2024-07-20 16:17:04.032145] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:17539967979188586642 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.424 [2024-07-20 16:17:04.032163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.424 #32 NEW cov: 11828 ft: 14162 corp: 21/1637b lim: 100 exec/s: 32 rss: 67Mb L: 88/98 MS: 1 PersAutoDict- DE: "\010\222\363j{\362.\000"- 00:08:35.424 [2024-07-20 16:17:04.071785] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846791579194835779 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.424 [2024-07-20 16:17:04.071813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.424 [2024-07-20 16:17:04.071869] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580117582659 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.424 [2024-07-20 16:17:04.071886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.424 #33 NEW cov: 11828 ft: 14189 corp: 22/1679b lim: 100 exec/s: 33 rss: 68Mb L: 42/98 MS: 1 EraseBytes- 00:08:35.424 [2024-07-20 16:17:04.111887] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846791579194835749 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.424 [2024-07-20 16:17:04.111915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.424 [2024-07-20 16:17:04.111960] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4702676392075281219 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.424 [2024-07-20 16:17:04.111977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.424 #34 NEW cov: 11828 ft: 14198 corp: 23/1736b lim: 100 exec/s: 34 rss: 68Mb L: 57/98 MS: 1 InsertByte- 00:08:35.424 [2024-07-20 16:17:04.152175] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846791579194835779 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.424 [2024-07-20 16:17:04.152205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.424 [2024-07-20 16:17:04.152253] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580151137089 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.424 [2024-07-20 16:17:04.152268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.424 [2024-07-20 16:17:04.152324] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 
lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.424 [2024-07-20 16:17:04.152341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.424 #35 NEW cov: 11828 ft: 14285 corp: 24/1800b lim: 100 exec/s: 35 rss: 68Mb L: 64/98 MS: 1 PersAutoDict- DE: "\010\222\363j{\362.\000"- 00:08:35.424 [2024-07-20 16:17:04.192439] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:7152634588408529731 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.424 [2024-07-20 16:17:04.192472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.424 [2024-07-20 16:17:04.192512] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.424 [2024-07-20 16:17:04.192527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.424 [2024-07-20 16:17:04.192580] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.424 [2024-07-20 16:17:04.192597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.424 [2024-07-20 16:17:04.192650] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:617823738964921923 len:11777 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.424 [2024-07-20 16:17:04.192665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.424 #36 NEW cov: 11828 ft: 14292 corp: 25/1896b lim: 100 exec/s: 36 rss: 68Mb L: 96/98 MS: 1 ChangeBit- 00:08:35.683 [2024-07-20 16:17:04.232563] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846791579194835779 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.683 [2024-07-20 16:17:04.232596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.683 [2024-07-20 16:17:04.232628] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.683 [2024-07-20 16:17:04.232642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.683 [2024-07-20 16:17:04.232698] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.683 [2024-07-20 16:17:04.232713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.683 [2024-07-20 16:17:04.232766] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:4830272176199225038 len:61999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.683 [2024-07-20 16:17:04.232782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.683 #37 NEW cov: 11828 ft: 14308 corp: 
26/1993b lim: 100 exec/s: 37 rss: 68Mb L: 97/98 MS: 1 InsertByte- 00:08:35.683 [2024-07-20 16:17:04.272482] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846791579194835779 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.683 [2024-07-20 16:17:04.272511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.683 [2024-07-20 16:17:04.272550] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580117582659 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.683 [2024-07-20 16:17:04.272567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.683 [2024-07-20 16:17:04.272620] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4846791579022655488 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.683 [2024-07-20 16:17:04.272637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.683 #38 NEW cov: 11828 ft: 14357 corp: 27/2069b lim: 100 exec/s: 38 rss: 68Mb L: 76/98 MS: 1 ShuffleBytes- 00:08:35.683 [2024-07-20 16:17:04.312635] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846791579194835779 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.683 [2024-07-20 16:17:04.312663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.683 [2024-07-20 16:17:04.312711] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580117582659 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.683 [2024-07-20 16:17:04.312729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.683 [2024-07-20 16:17:04.312784] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4846791579022655488 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.683 [2024-07-20 16:17:04.312801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.683 #39 NEW cov: 11828 ft: 14372 corp: 28/2145b lim: 100 exec/s: 39 rss: 68Mb L: 76/98 MS: 1 ChangeBinInt- 00:08:35.683 [2024-07-20 16:17:04.352893] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846791579194835779 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.683 [2024-07-20 16:17:04.352921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.683 [2024-07-20 16:17:04.352959] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580117582659 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.683 [2024-07-20 16:17:04.352975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.683 [2024-07-20 16:17:04.353027] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.683 [2024-07-20 16:17:04.353043] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.683 [2024-07-20 16:17:04.353097] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.683 [2024-07-20 16:17:04.353113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.683 #40 NEW cov: 11828 ft: 14393 corp: 29/2233b lim: 100 exec/s: 40 rss: 68Mb L: 88/98 MS: 1 ShuffleBytes- 00:08:35.683 [2024-07-20 16:17:04.393093] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846791579194835779 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.683 [2024-07-20 16:17:04.393121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.683 [2024-07-20 16:17:04.393171] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580151137091 len:17303 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.683 [2024-07-20 16:17:04.393187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.683 [2024-07-20 16:17:04.393256] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.683 [2024-07-20 16:17:04.393270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.683 [2024-07-20 16:17:04.393325] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:14902075007643340494 len:62315 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.683 [2024-07-20 16:17:04.393341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.683 [2024-07-20 16:17:04.393397] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.683 [2024-07-20 16:17:04.393413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:35.683 #41 NEW cov: 11828 ft: 14444 corp: 30/2333b lim: 100 exec/s: 41 rss: 68Mb L: 100/100 MS: 1 CrossOver- 00:08:35.683 [2024-07-20 16:17:04.432797] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846791579194835779 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.683 [2024-07-20 16:17:04.432826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.683 [2024-07-20 16:17:04.432880] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580117582659 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.683 [2024-07-20 16:17:04.432894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.683 #42 NEW cov: 11828 ft: 14460 corp: 31/2386b lim: 100 exec/s: 42 rss: 68Mb L: 53/100 MS: 1 ChangeBinInt- 00:08:35.683 [2024-07-20 16:17:04.462929] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846791579194835779 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.683 [2024-07-20 16:17:04.462957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.683 [2024-07-20 16:17:04.463027] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.683 [2024-07-20 16:17:04.463043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.942 #43 NEW cov: 11828 ft: 14482 corp: 32/2439b lim: 100 exec/s: 43 rss: 68Mb L: 53/100 MS: 1 CopyPart- 00:08:35.942 [2024-07-20 16:17:04.503330] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846791579194835779 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.942 [2024-07-20 16:17:04.503360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.942 [2024-07-20 16:17:04.503404] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580117582659 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.942 [2024-07-20 16:17:04.503421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.943 [2024-07-20 16:17:04.503492] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4846791579022655488 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.943 [2024-07-20 16:17:04.503508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.943 [2024-07-20 16:17:04.503564] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.943 [2024-07-20 16:17:04.503580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.943 #44 NEW cov: 11828 ft: 14512 corp: 33/2531b lim: 100 exec/s: 44 rss: 69Mb L: 92/100 MS: 1 CrossOver- 00:08:35.943 [2024-07-20 16:17:04.543466] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846791579194835779 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.943 [2024-07-20 16:17:04.543494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.943 [2024-07-20 16:17:04.543558] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580117582659 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.943 [2024-07-20 16:17:04.543575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.943 [2024-07-20 16:17:04.543631] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4846791579022655488 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.943 [2024-07-20 16:17:04.543651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
00:08:35.943 [2024-07-20 16:17:04.543706] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.943 [2024-07-20 16:17:04.543721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.943 #45 NEW cov: 11828 ft: 14515 corp: 34/2623b lim: 100 exec/s: 45 rss: 69Mb L: 92/100 MS: 1 ChangeBinInt- 00:08:35.943 [2024-07-20 16:17:04.583735] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846791579194835779 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.943 [2024-07-20 16:17:04.583762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.943 [2024-07-20 16:17:04.583835] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580151137091 len:17303 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.943 [2024-07-20 16:17:04.583851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.943 [2024-07-20 16:17:04.583906] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.943 [2024-07-20 16:17:04.583922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.943 [2024-07-20 16:17:04.583975] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:4886070033371811534 len:62315 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.943 [2024-07-20 16:17:04.583992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.943 [2024-07-20 16:17:04.584045] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.943 [2024-07-20 16:17:04.584061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:35.943 #46 NEW cov: 11828 ft: 14528 corp: 35/2723b lim: 100 exec/s: 46 rss: 69Mb L: 100/100 MS: 1 ShuffleBytes- 00:08:35.943 [2024-07-20 16:17:04.623733] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846905928404124483 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.943 [2024-07-20 16:17:04.623761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.943 [2024-07-20 16:17:04.623825] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.943 [2024-07-20 16:17:04.623842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.943 [2024-07-20 16:17:04.623895] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.943 [2024-07-20 16:17:04.623909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.943 [2024-07-20 16:17:04.623960] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:4846791582492315203 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.943 [2024-07-20 16:17:04.623977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.943 #47 NEW cov: 11828 ft: 14545 corp: 36/2819b lim: 100 exec/s: 47 rss: 69Mb L: 96/100 MS: 1 ChangeBinInt- 00:08:35.943 [2024-07-20 16:17:04.663872] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846791579194835779 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.943 [2024-07-20 16:17:04.663903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.943 [2024-07-20 16:17:04.663940] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580117582659 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.943 [2024-07-20 16:17:04.663956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.943 [2024-07-20 16:17:04.664012] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.943 [2024-07-20 16:17:04.664028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.943 [2024-07-20 16:17:04.664084] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:17539967979188586642 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.943 [2024-07-20 16:17:04.664099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.943 #48 NEW cov: 11828 ft: 14564 corp: 37/2916b lim: 100 exec/s: 48 rss: 69Mb L: 97/100 MS: 1 InsertRepeatedBytes- 00:08:35.943 [2024-07-20 16:17:04.703649] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846791579194835779 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.943 [2024-07-20 16:17:04.703676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.943 [2024-07-20 16:17:04.703743] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.943 [2024-07-20 16:17:04.703759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.943 #49 NEW cov: 11828 ft: 14575 corp: 38/2963b lim: 100 exec/s: 49 rss: 69Mb L: 47/100 MS: 1 EraseBytes- 00:08:35.943 [2024-07-20 16:17:04.744267] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846791579194835779 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.943 [2024-07-20 16:17:04.744295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.943 [2024-07-20 16:17:04.744344] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 
lba:4846791580151137091 len:17303 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.943 [2024-07-20 16:17:04.744360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.943 [2024-07-20 16:17:04.744413] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.943 [2024-07-20 16:17:04.744430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.943 [2024-07-20 16:17:04.744490] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:4886070033371811534 len:62315 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.943 [2024-07-20 16:17:04.744506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.943 [2024-07-20 16:17:04.744558] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.943 [2024-07-20 16:17:04.744573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:36.202 #50 NEW cov: 11828 ft: 14582 corp: 39/3063b lim: 100 exec/s: 50 rss: 69Mb L: 100/100 MS: 1 ChangeByte- 00:08:36.203 [2024-07-20 16:17:04.783979] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846791579194835779 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.203 [2024-07-20 16:17:04.784011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.203 [2024-07-20 16:17:04.784060] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.203 [2024-07-20 16:17:04.784077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.203 [2024-07-20 16:17:04.784131] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.203 [2024-07-20 16:17:04.784147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.203 #51 NEW cov: 11828 ft: 14617 corp: 40/3125b lim: 100 exec/s: 51 rss: 69Mb L: 62/100 MS: 1 EraseBytes- 00:08:36.203 [2024-07-20 16:17:04.824245] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4855798778449576771 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.203 [2024-07-20 16:17:04.824273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.203 [2024-07-20 16:17:04.824332] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.203 [2024-07-20 16:17:04.824349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.203 [2024-07-20 16:17:04.824403] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.203 [2024-07-20 16:17:04.824418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.203 [2024-07-20 16:17:04.824474] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:4846791582492315203 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.203 [2024-07-20 16:17:04.824490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.203 #52 NEW cov: 11828 ft: 14654 corp: 41/3221b lim: 100 exec/s: 52 rss: 69Mb L: 96/100 MS: 1 ChangeBit- 00:08:36.203 [2024-07-20 16:17:04.864046] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846826763563189059 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.203 [2024-07-20 16:17:04.864074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.203 [2024-07-20 16:17:04.864123] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.203 [2024-07-20 16:17:04.864140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.203 #58 NEW cov: 11828 ft: 14698 corp: 42/3270b lim: 100 exec/s: 58 rss: 69Mb L: 49/100 MS: 1 CrossOver- 00:08:36.203 [2024-07-20 16:17:04.904477] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:5999713083801682755 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.203 [2024-07-20 16:17:04.904505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.203 [2024-07-20 16:17:04.904569] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580117582659 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.203 [2024-07-20 16:17:04.904587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.203 [2024-07-20 16:17:04.904644] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.203 [2024-07-20 16:17:04.904661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.203 [2024-07-20 16:17:04.904716] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.203 [2024-07-20 16:17:04.904732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.203 #59 NEW cov: 11828 ft: 14721 corp: 43/3354b lim: 100 exec/s: 59 rss: 70Mb L: 84/100 MS: 1 CopyPart- 00:08:36.203 [2024-07-20 16:17:04.944591] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:7152634588408529731 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.203 [2024-07-20 16:17:04.944619] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.203 [2024-07-20 16:17:04.944679] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.203 [2024-07-20 16:17:04.944696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.203 [2024-07-20 16:17:04.944749] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4846791580151137091 len:17220 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.203 [2024-07-20 16:17:04.944766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.203 [2024-07-20 16:17:04.944821] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:617823738964921923 len:11777 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.203 [2024-07-20 16:17:04.944838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.203 #60 NEW cov: 11828 ft: 14724 corp: 44/3450b lim: 100 exec/s: 30 rss: 70Mb L: 96/100 MS: 1 CMP- DE: "\005\000\000\000"- 00:08:36.203 #60 DONE cov: 11828 ft: 14724 corp: 44/3450b lim: 100 exec/s: 30 rss: 70Mb 00:08:36.203 ###### Recommended dictionary. ###### 00:08:36.203 "\010\222\363j{\362.\000" # Uses: 3 00:08:36.203 "\005\000\000\000" # Uses: 0 00:08:36.203 ###### End of recommended dictionary. ###### 00:08:36.203 Done 60 runs in 2 second(s) 00:08:36.466 16:17:05 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_24.conf 00:08:36.466 16:17:05 -- ../common.sh@72 -- # (( i++ )) 00:08:36.466 16:17:05 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:36.466 16:17:05 -- nvmf/run.sh@71 -- # trap - SIGINT SIGTERM EXIT 00:08:36.466 00:08:36.466 real 1m2.238s 00:08:36.466 user 1m38.506s 00:08:36.466 sys 0m7.179s 00:08:36.466 16:17:05 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:36.466 16:17:05 -- common/autotest_common.sh@10 -- # set +x 00:08:36.466 ************************************ 00:08:36.466 END TEST nvmf_fuzz 00:08:36.466 ************************************ 00:08:36.466 16:17:05 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:08:36.466 16:17:05 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:08:36.466 16:17:05 -- fuzz/llvm.sh@63 -- # run_test vfio_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:36.466 16:17:05 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:08:36.466 16:17:05 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:36.466 16:17:05 -- common/autotest_common.sh@10 -- # set +x 00:08:36.466 ************************************ 00:08:36.466 START TEST vfio_fuzz 00:08:36.466 ************************************ 00:08:36.466 16:17:05 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:36.466 * Looking for test storage... 
00:08:36.466 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:36.466 16:17:05 -- vfio/run.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:08:36.466 16:17:05 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:08:36.466 16:17:05 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:08:36.466 16:17:05 -- common/autotest_common.sh@34 -- # set -e 00:08:36.466 16:17:05 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:08:36.466 16:17:05 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:08:36.466 16:17:05 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:08:36.466 16:17:05 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:08:36.466 16:17:05 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:08:36.466 16:17:05 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:08:36.466 16:17:05 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:08:36.466 16:17:05 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:08:36.466 16:17:05 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:08:36.466 16:17:05 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:08:36.466 16:17:05 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:08:36.466 16:17:05 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:08:36.466 16:17:05 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:08:36.466 16:17:05 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:08:36.466 16:17:05 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:08:36.466 16:17:05 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:08:36.466 16:17:05 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:08:36.466 16:17:05 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:08:36.466 16:17:05 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:08:36.466 16:17:05 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:08:36.466 16:17:05 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:08:36.466 16:17:05 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:08:36.466 16:17:05 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:36.466 16:17:05 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:08:36.466 16:17:05 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:08:36.466 16:17:05 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:08:36.466 16:17:05 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:08:36.466 16:17:05 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:08:36.466 16:17:05 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:08:36.466 16:17:05 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:08:36.466 16:17:05 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:08:36.466 16:17:05 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:08:36.466 16:17:05 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:08:36.466 16:17:05 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:08:36.466 16:17:05 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:08:36.466 16:17:05 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:08:36.466 16:17:05 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:08:36.466 16:17:05 -- common/build_config.sh@34 -- # 
CONFIG_FUZZER_LIB=/usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:08:36.466 16:17:05 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:08:36.466 16:17:05 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:36.466 16:17:05 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:08:36.466 16:17:05 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:08:36.466 16:17:05 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:08:36.466 16:17:05 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:08:36.466 16:17:05 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:36.466 16:17:05 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:08:36.466 16:17:05 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:08:36.466 16:17:05 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:08:36.466 16:17:05 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:08:36.466 16:17:05 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:08:36.466 16:17:05 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:08:36.466 16:17:05 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:08:36.466 16:17:05 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:08:36.466 16:17:05 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:08:36.466 16:17:05 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:08:36.466 16:17:05 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:08:36.466 16:17:05 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:08:36.466 16:17:05 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:08:36.467 16:17:05 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:08:36.467 16:17:05 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:08:36.467 16:17:05 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:08:36.467 16:17:05 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:08:36.467 16:17:05 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:08:36.467 16:17:05 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:08:36.467 16:17:05 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:36.467 16:17:05 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:08:36.467 16:17:05 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:08:36.467 16:17:05 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:08:36.467 16:17:05 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:08:36.467 16:17:05 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:08:36.467 16:17:05 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:08:36.467 16:17:05 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:08:36.467 16:17:05 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:08:36.467 16:17:05 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:08:36.467 16:17:05 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:08:36.467 16:17:05 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:08:36.467 16:17:05 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:08:36.467 16:17:05 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:08:36.467 16:17:05 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:08:36.467 16:17:05 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:08:36.467 16:17:05 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:08:36.467 16:17:05 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:08:36.467 
16:17:05 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:08:36.467 16:17:05 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:36.467 16:17:05 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:36.467 16:17:05 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:36.467 16:17:05 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:36.467 16:17:05 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:36.467 16:17:05 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:36.467 16:17:05 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:36.467 16:17:05 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:36.467 16:17:05 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:08:36.467 16:17:05 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:08:36.467 16:17:05 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:08:36.467 16:17:05 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:08:36.467 16:17:05 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:08:36.467 16:17:05 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:08:36.467 16:17:05 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:08:36.467 16:17:05 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:08:36.467 #define SPDK_CONFIG_H 00:08:36.467 #define SPDK_CONFIG_APPS 1 00:08:36.467 #define SPDK_CONFIG_ARCH native 00:08:36.467 #undef SPDK_CONFIG_ASAN 00:08:36.467 #undef SPDK_CONFIG_AVAHI 00:08:36.467 #undef SPDK_CONFIG_CET 00:08:36.467 #define SPDK_CONFIG_COVERAGE 1 00:08:36.467 #define SPDK_CONFIG_CROSS_PREFIX 00:08:36.467 #undef SPDK_CONFIG_CRYPTO 00:08:36.467 #undef SPDK_CONFIG_CRYPTO_MLX5 00:08:36.467 #undef SPDK_CONFIG_CUSTOMOCF 00:08:36.467 #undef SPDK_CONFIG_DAOS 00:08:36.467 #define SPDK_CONFIG_DAOS_DIR 00:08:36.467 #define SPDK_CONFIG_DEBUG 1 00:08:36.467 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:08:36.467 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:36.467 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:36.467 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:36.467 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:08:36.467 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:36.467 #define SPDK_CONFIG_EXAMPLES 1 00:08:36.467 #undef SPDK_CONFIG_FC 00:08:36.467 #define SPDK_CONFIG_FC_PATH 00:08:36.467 #define SPDK_CONFIG_FIO_PLUGIN 1 00:08:36.467 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:08:36.467 #undef SPDK_CONFIG_FUSE 00:08:36.467 #define SPDK_CONFIG_FUZZER 1 00:08:36.467 #define SPDK_CONFIG_FUZZER_LIB /usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:08:36.467 #undef SPDK_CONFIG_GOLANG 00:08:36.467 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:08:36.467 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:08:36.467 #undef 
SPDK_CONFIG_HAVE_LIBARCHIVE 00:08:36.467 #undef SPDK_CONFIG_HAVE_LIBBSD 00:08:36.467 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:08:36.467 #define SPDK_CONFIG_IDXD 1 00:08:36.467 #define SPDK_CONFIG_IDXD_KERNEL 1 00:08:36.467 #undef SPDK_CONFIG_IPSEC_MB 00:08:36.467 #define SPDK_CONFIG_IPSEC_MB_DIR 00:08:36.467 #define SPDK_CONFIG_ISAL 1 00:08:36.467 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:08:36.467 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:08:36.467 #define SPDK_CONFIG_LIBDIR 00:08:36.467 #undef SPDK_CONFIG_LTO 00:08:36.467 #define SPDK_CONFIG_MAX_LCORES 00:08:36.467 #define SPDK_CONFIG_NVME_CUSE 1 00:08:36.467 #undef SPDK_CONFIG_OCF 00:08:36.467 #define SPDK_CONFIG_OCF_PATH 00:08:36.467 #define SPDK_CONFIG_OPENSSL_PATH 00:08:36.467 #undef SPDK_CONFIG_PGO_CAPTURE 00:08:36.467 #undef SPDK_CONFIG_PGO_USE 00:08:36.467 #define SPDK_CONFIG_PREFIX /usr/local 00:08:36.467 #undef SPDK_CONFIG_RAID5F 00:08:36.467 #undef SPDK_CONFIG_RBD 00:08:36.467 #define SPDK_CONFIG_RDMA 1 00:08:36.467 #define SPDK_CONFIG_RDMA_PROV verbs 00:08:36.467 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:08:36.467 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:08:36.467 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:08:36.467 #undef SPDK_CONFIG_SHARED 00:08:36.467 #undef SPDK_CONFIG_SMA 00:08:36.467 #define SPDK_CONFIG_TESTS 1 00:08:36.467 #undef SPDK_CONFIG_TSAN 00:08:36.467 #define SPDK_CONFIG_UBLK 1 00:08:36.467 #define SPDK_CONFIG_UBSAN 1 00:08:36.467 #undef SPDK_CONFIG_UNIT_TESTS 00:08:36.467 #undef SPDK_CONFIG_URING 00:08:36.467 #define SPDK_CONFIG_URING_PATH 00:08:36.467 #undef SPDK_CONFIG_URING_ZNS 00:08:36.467 #undef SPDK_CONFIG_USDT 00:08:36.467 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:08:36.467 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:08:36.467 #define SPDK_CONFIG_VFIO_USER 1 00:08:36.467 #define SPDK_CONFIG_VFIO_USER_DIR 00:08:36.467 #define SPDK_CONFIG_VHOST 1 00:08:36.467 #define SPDK_CONFIG_VIRTIO 1 00:08:36.467 #undef SPDK_CONFIG_VTUNE 00:08:36.467 #define SPDK_CONFIG_VTUNE_DIR 00:08:36.467 #define SPDK_CONFIG_WERROR 1 00:08:36.467 #define SPDK_CONFIG_WPDK_DIR 00:08:36.467 #undef SPDK_CONFIG_XNVME 00:08:36.467 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:08:36.798 16:17:05 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:08:36.798 16:17:05 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:36.798 16:17:05 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:36.798 16:17:05 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:36.798 16:17:05 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:36.798 16:17:05 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:36.798 16:17:05 -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:36.798 16:17:05 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:36.798 16:17:05 -- paths/export.sh@5 -- # export PATH 00:08:36.798 16:17:05 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:36.798 16:17:05 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:36.798 16:17:05 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:36.798 16:17:05 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:36.798 16:17:05 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:36.798 16:17:05 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:08:36.798 16:17:05 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:36.798 16:17:05 -- pm/common@16 -- # TEST_TAG=N/A 00:08:36.798 16:17:05 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:08:36.798 16:17:05 -- common/autotest_common.sh@52 -- # : 1 00:08:36.798 16:17:05 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:08:36.798 16:17:05 -- common/autotest_common.sh@56 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:08:36.798 16:17:05 -- common/autotest_common.sh@58 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:08:36.798 16:17:05 -- common/autotest_common.sh@60 -- # : 1 00:08:36.798 16:17:05 -- common/autotest_common.sh@61 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:08:36.798 16:17:05 -- common/autotest_common.sh@62 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:08:36.798 16:17:05 -- common/autotest_common.sh@64 -- # : 00:08:36.798 16:17:05 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:08:36.798 16:17:05 -- common/autotest_common.sh@66 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:08:36.798 16:17:05 
-- common/autotest_common.sh@68 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:08:36.798 16:17:05 -- common/autotest_common.sh@70 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:08:36.798 16:17:05 -- common/autotest_common.sh@72 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:08:36.798 16:17:05 -- common/autotest_common.sh@74 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:08:36.798 16:17:05 -- common/autotest_common.sh@76 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:08:36.798 16:17:05 -- common/autotest_common.sh@78 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:08:36.798 16:17:05 -- common/autotest_common.sh@80 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:08:36.798 16:17:05 -- common/autotest_common.sh@82 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:08:36.798 16:17:05 -- common/autotest_common.sh@84 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:08:36.798 16:17:05 -- common/autotest_common.sh@86 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:08:36.798 16:17:05 -- common/autotest_common.sh@88 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:08:36.798 16:17:05 -- common/autotest_common.sh@90 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:08:36.798 16:17:05 -- common/autotest_common.sh@92 -- # : 1 00:08:36.798 16:17:05 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:08:36.798 16:17:05 -- common/autotest_common.sh@94 -- # : 1 00:08:36.798 16:17:05 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:08:36.798 16:17:05 -- common/autotest_common.sh@96 -- # : rdma 00:08:36.798 16:17:05 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:08:36.798 16:17:05 -- common/autotest_common.sh@98 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:08:36.798 16:17:05 -- common/autotest_common.sh@100 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:08:36.798 16:17:05 -- common/autotest_common.sh@102 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:08:36.798 16:17:05 -- common/autotest_common.sh@104 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:08:36.798 16:17:05 -- common/autotest_common.sh@106 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:08:36.798 16:17:05 -- common/autotest_common.sh@108 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@109 -- # export SPDK_TEST_VHOST_INIT 00:08:36.798 16:17:05 -- common/autotest_common.sh@110 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:08:36.798 16:17:05 -- common/autotest_common.sh@112 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:08:36.798 16:17:05 -- common/autotest_common.sh@114 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:08:36.798 
16:17:05 -- common/autotest_common.sh@116 -- # : 1 00:08:36.798 16:17:05 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:08:36.798 16:17:05 -- common/autotest_common.sh@118 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:36.798 16:17:05 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:08:36.798 16:17:05 -- common/autotest_common.sh@120 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:08:36.798 16:17:05 -- common/autotest_common.sh@122 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:08:36.798 16:17:05 -- common/autotest_common.sh@124 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:08:36.798 16:17:05 -- common/autotest_common.sh@126 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:08:36.798 16:17:05 -- common/autotest_common.sh@128 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:08:36.798 16:17:05 -- common/autotest_common.sh@130 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:08:36.798 16:17:05 -- common/autotest_common.sh@132 -- # : v22.11.4 00:08:36.798 16:17:05 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:08:36.798 16:17:05 -- common/autotest_common.sh@134 -- # : true 00:08:36.798 16:17:05 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:08:36.798 16:17:05 -- common/autotest_common.sh@136 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:08:36.798 16:17:05 -- common/autotest_common.sh@138 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:08:36.798 16:17:05 -- common/autotest_common.sh@140 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:08:36.798 16:17:05 -- common/autotest_common.sh@142 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:08:36.798 16:17:05 -- common/autotest_common.sh@144 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:08:36.798 16:17:05 -- common/autotest_common.sh@146 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:08:36.798 16:17:05 -- common/autotest_common.sh@148 -- # : 00:08:36.798 16:17:05 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:08:36.798 16:17:05 -- common/autotest_common.sh@150 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:08:36.798 16:17:05 -- common/autotest_common.sh@152 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:08:36.798 16:17:05 -- common/autotest_common.sh@154 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:08:36.798 16:17:05 -- common/autotest_common.sh@156 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:08:36.798 16:17:05 -- common/autotest_common.sh@158 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:08:36.798 16:17:05 -- common/autotest_common.sh@160 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:08:36.798 16:17:05 -- common/autotest_common.sh@163 -- # : 00:08:36.798 16:17:05 -- 
common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:08:36.798 16:17:05 -- common/autotest_common.sh@165 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:08:36.798 16:17:05 -- common/autotest_common.sh@167 -- # : 0 00:08:36.798 16:17:05 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:08:36.798 16:17:05 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:36.798 16:17:05 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:36.798 16:17:05 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:36.798 16:17:05 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:36.798 16:17:05 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:36.798 16:17:05 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:36.798 16:17:05 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:36.798 16:17:05 -- common/autotest_common.sh@174 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:36.798 16:17:05 -- common/autotest_common.sh@177 
-- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:08:36.798 16:17:05 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:08:36.798 16:17:05 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:36.798 16:17:05 -- common/autotest_common.sh@181 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:36.798 16:17:05 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:08:36.798 16:17:05 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:08:36.798 16:17:05 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:36.798 16:17:05 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:36.798 16:17:05 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:36.798 16:17:05 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:36.798 16:17:05 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:08:36.798 16:17:05 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:08:36.798 16:17:05 -- common/autotest_common.sh@196 -- # cat 00:08:36.798 16:17:05 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:08:36.798 16:17:05 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:36.798 16:17:05 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:36.798 16:17:05 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:36.798 16:17:05 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:36.798 16:17:05 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:08:36.798 16:17:05 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:08:36.798 16:17:05 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:36.798 16:17:05 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:36.798 16:17:05 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:36.798 16:17:05 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:36.798 16:17:05 -- 
common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:36.798 16:17:05 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:36.798 16:17:05 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:36.798 16:17:05 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:36.798 16:17:05 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:36.798 16:17:05 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:36.798 16:17:05 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:36.798 16:17:05 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:36.798 16:17:05 -- common/autotest_common.sh@248 -- # '[' 0 -eq 0 ']' 00:08:36.798 16:17:05 -- common/autotest_common.sh@249 -- # export valgrind= 00:08:36.798 16:17:05 -- common/autotest_common.sh@249 -- # valgrind= 00:08:36.798 16:17:05 -- common/autotest_common.sh@255 -- # uname -s 00:08:36.798 16:17:05 -- common/autotest_common.sh@255 -- # '[' Linux = Linux ']' 00:08:36.798 16:17:05 -- common/autotest_common.sh@256 -- # HUGEMEM=4096 00:08:36.798 16:17:05 -- common/autotest_common.sh@257 -- # export CLEAR_HUGE=yes 00:08:36.798 16:17:05 -- common/autotest_common.sh@257 -- # CLEAR_HUGE=yes 00:08:36.798 16:17:05 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:08:36.798 16:17:05 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:08:36.798 16:17:05 -- common/autotest_common.sh@265 -- # MAKE=make 00:08:36.798 16:17:05 -- common/autotest_common.sh@266 -- # MAKEFLAGS=-j112 00:08:36.798 16:17:05 -- common/autotest_common.sh@282 -- # export HUGEMEM=4096 00:08:36.798 16:17:05 -- common/autotest_common.sh@282 -- # HUGEMEM=4096 00:08:36.798 16:17:05 -- common/autotest_common.sh@284 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:08:36.798 16:17:05 -- common/autotest_common.sh@289 -- # NO_HUGE=() 00:08:36.798 16:17:05 -- common/autotest_common.sh@290 -- # TEST_MODE= 00:08:36.798 16:17:05 -- common/autotest_common.sh@309 -- # [[ -z 2280502 ]] 00:08:36.798 16:17:05 -- common/autotest_common.sh@309 -- # kill -0 2280502 00:08:36.798 16:17:05 -- common/autotest_common.sh@1665 -- # set_test_storage 2147483648 00:08:36.798 16:17:05 -- common/autotest_common.sh@319 -- # [[ -v testdir ]] 00:08:36.799 16:17:05 -- common/autotest_common.sh@321 -- # local requested_size=2147483648 00:08:36.799 16:17:05 -- common/autotest_common.sh@322 -- # local mount target_dir 00:08:36.799 16:17:05 -- common/autotest_common.sh@324 -- # local -A mounts fss sizes avails uses 00:08:36.799 16:17:05 -- common/autotest_common.sh@325 -- # local source fs size avail mount use 00:08:36.799 16:17:05 -- common/autotest_common.sh@327 -- # local storage_fallback storage_candidates 00:08:36.799 16:17:05 -- common/autotest_common.sh@329 -- # mktemp -udt spdk.XXXXXX 00:08:36.799 16:17:05 -- common/autotest_common.sh@329 -- # storage_fallback=/tmp/spdk.nkdVxm 00:08:36.799 16:17:05 -- common/autotest_common.sh@334 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:08:36.799 16:17:05 -- common/autotest_common.sh@336 -- # [[ -n '' ]] 00:08:36.799 16:17:05 -- common/autotest_common.sh@341 -- # 
[[ -n '' ]] 00:08:36.799 16:17:05 -- common/autotest_common.sh@346 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.nkdVxm/tests/vfio /tmp/spdk.nkdVxm 00:08:36.799 16:17:05 -- common/autotest_common.sh@349 -- # requested_size=2214592512 00:08:36.799 16:17:05 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:36.799 16:17:05 -- common/autotest_common.sh@318 -- # df -T 00:08:36.799 16:17:05 -- common/autotest_common.sh@318 -- # grep -v Filesystem 00:08:36.799 16:17:05 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_devtmpfs 00:08:36.799 16:17:05 -- common/autotest_common.sh@352 -- # fss["$mount"]=devtmpfs 00:08:36.799 16:17:05 -- common/autotest_common.sh@353 -- # avails["$mount"]=67108864 00:08:36.799 16:17:05 -- common/autotest_common.sh@353 -- # sizes["$mount"]=67108864 00:08:36.799 16:17:05 -- common/autotest_common.sh@354 -- # uses["$mount"]=0 00:08:36.799 16:17:05 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:36.799 16:17:05 -- common/autotest_common.sh@352 -- # mounts["$mount"]=/dev/pmem0 00:08:36.799 16:17:05 -- common/autotest_common.sh@352 -- # fss["$mount"]=ext2 00:08:36.799 16:17:05 -- common/autotest_common.sh@353 -- # avails["$mount"]=954408960 00:08:36.799 16:17:05 -- common/autotest_common.sh@353 -- # sizes["$mount"]=5284429824 00:08:36.799 16:17:05 -- common/autotest_common.sh@354 -- # uses["$mount"]=4330020864 00:08:36.799 16:17:05 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:36.799 16:17:05 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_root 00:08:36.799 16:17:05 -- common/autotest_common.sh@352 -- # fss["$mount"]=overlay 00:08:36.799 16:17:05 -- common/autotest_common.sh@353 -- # avails["$mount"]=52131692544 00:08:36.799 16:17:05 -- common/autotest_common.sh@353 -- # sizes["$mount"]=61742317568 00:08:36.799 16:17:05 -- common/autotest_common.sh@354 -- # uses["$mount"]=9610625024 00:08:36.799 16:17:05 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:36.799 16:17:05 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:36.799 16:17:05 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:36.799 16:17:05 -- common/autotest_common.sh@353 -- # avails["$mount"]=30868566016 00:08:36.799 16:17:05 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30871158784 00:08:36.799 16:17:05 -- common/autotest_common.sh@354 -- # uses["$mount"]=2592768 00:08:36.799 16:17:05 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:36.799 16:17:05 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:36.799 16:17:05 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:36.799 16:17:05 -- common/autotest_common.sh@353 -- # avails["$mount"]=12342493184 00:08:36.799 16:17:05 -- common/autotest_common.sh@353 -- # sizes["$mount"]=12348465152 00:08:36.799 16:17:05 -- common/autotest_common.sh@354 -- # uses["$mount"]=5971968 00:08:36.799 16:17:05 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:36.799 16:17:05 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:36.799 16:17:05 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:36.799 16:17:05 -- common/autotest_common.sh@353 -- # avails["$mount"]=30868783104 00:08:36.799 16:17:05 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30871158784 00:08:36.799 16:17:05 -- 
common/autotest_common.sh@354 -- # uses["$mount"]=2375680 00:08:36.799 16:17:05 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:36.799 16:17:05 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:36.799 16:17:05 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:36.799 16:17:05 -- common/autotest_common.sh@353 -- # avails["$mount"]=6174224384 00:08:36.799 16:17:05 -- common/autotest_common.sh@353 -- # sizes["$mount"]=6174228480 00:08:36.799 16:17:05 -- common/autotest_common.sh@354 -- # uses["$mount"]=4096 00:08:36.799 16:17:05 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:36.799 16:17:05 -- common/autotest_common.sh@357 -- # printf '* Looking for test storage...\n' 00:08:36.799 * Looking for test storage... 00:08:36.799 16:17:05 -- common/autotest_common.sh@359 -- # local target_space new_size 00:08:36.799 16:17:05 -- common/autotest_common.sh@360 -- # for target_dir in "${storage_candidates[@]}" 00:08:36.799 16:17:05 -- common/autotest_common.sh@363 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:36.799 16:17:05 -- common/autotest_common.sh@363 -- # awk '$1 !~ /Filesystem/{print $6}' 00:08:36.799 16:17:05 -- common/autotest_common.sh@363 -- # mount=/ 00:08:36.799 16:17:05 -- common/autotest_common.sh@365 -- # target_space=52131692544 00:08:36.799 16:17:05 -- common/autotest_common.sh@366 -- # (( target_space == 0 || target_space < requested_size )) 00:08:36.799 16:17:05 -- common/autotest_common.sh@369 -- # (( target_space >= requested_size )) 00:08:36.799 16:17:05 -- common/autotest_common.sh@371 -- # [[ overlay == tmpfs ]] 00:08:36.799 16:17:05 -- common/autotest_common.sh@371 -- # [[ overlay == ramfs ]] 00:08:36.799 16:17:05 -- common/autotest_common.sh@371 -- # [[ / == / ]] 00:08:36.799 16:17:05 -- common/autotest_common.sh@372 -- # new_size=11825217536 00:08:36.799 16:17:05 -- common/autotest_common.sh@373 -- # (( new_size * 100 / sizes[/] > 95 )) 00:08:36.799 16:17:05 -- common/autotest_common.sh@378 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:36.799 16:17:05 -- common/autotest_common.sh@378 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:36.799 16:17:05 -- common/autotest_common.sh@379 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:36.799 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:36.799 16:17:05 -- common/autotest_common.sh@380 -- # return 0 00:08:36.799 16:17:05 -- common/autotest_common.sh@1667 -- # set -o errtrace 00:08:36.799 16:17:05 -- common/autotest_common.sh@1668 -- # shopt -s extdebug 00:08:36.799 16:17:05 -- common/autotest_common.sh@1669 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:08:36.799 16:17:05 -- common/autotest_common.sh@1671 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:08:36.799 16:17:05 -- common/autotest_common.sh@1672 -- # true 00:08:36.799 16:17:05 -- common/autotest_common.sh@1674 -- # xtrace_fd 00:08:36.799 16:17:05 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:08:36.799 16:17:05 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:08:36.799 16:17:05 -- common/autotest_common.sh@27 -- # exec 00:08:36.799 16:17:05 -- common/autotest_common.sh@29 -- # exec 00:08:36.799 16:17:05 -- common/autotest_common.sh@31 -- # 
xtrace_restore 00:08:36.799 16:17:05 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:08:36.799 16:17:05 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:08:36.799 16:17:05 -- common/autotest_common.sh@18 -- # set -x 00:08:36.799 16:17:05 -- vfio/run.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:08:36.799 16:17:05 -- ../common.sh@8 -- # pids=() 00:08:36.799 16:17:05 -- vfio/run.sh@58 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:36.799 16:17:05 -- vfio/run.sh@59 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:36.799 16:17:05 -- vfio/run.sh@59 -- # fuzz_num=7 00:08:36.799 16:17:05 -- vfio/run.sh@60 -- # (( fuzz_num != 0 )) 00:08:36.799 16:17:05 -- vfio/run.sh@62 -- # trap 'cleanup /tmp/vfio-user-*; exit 1' SIGINT SIGTERM EXIT 00:08:36.799 16:17:05 -- vfio/run.sh@65 -- # mem_size=0 00:08:36.799 16:17:05 -- vfio/run.sh@66 -- # [[ 1 -eq 1 ]] 00:08:36.799 16:17:05 -- vfio/run.sh@67 -- # start_llvm_fuzz_short 7 1 00:08:36.799 16:17:05 -- ../common.sh@69 -- # local fuzz_num=7 00:08:36.799 16:17:05 -- ../common.sh@70 -- # local time=1 00:08:36.799 16:17:05 -- ../common.sh@72 -- # (( i = 0 )) 00:08:36.799 16:17:05 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:36.799 16:17:05 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:08:36.799 16:17:05 -- vfio/run.sh@22 -- # local fuzzer_type=0 00:08:36.799 16:17:05 -- vfio/run.sh@23 -- # local timen=1 00:08:36.799 16:17:05 -- vfio/run.sh@24 -- # local core=0x1 00:08:36.799 16:17:05 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:36.799 16:17:05 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:08:36.799 16:17:05 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:08:36.799 16:17:05 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:08:36.799 16:17:05 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:08:36.799 16:17:05 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:36.799 16:17:05 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:08:36.799 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:36.799 16:17:05 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 -Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:08:36.799 [2024-07-20 16:17:05.444714] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 
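The trace above amounts to a small driver loop: run.sh counts the fuzz targets in llvm_vfio_fuzz.c (one per '.fn =' entry), gives each target a private /tmp/vfio-user-N workspace, rewrites fuzz_vfio_json.conf with sed so every run gets its own vfio-user sockets, and launches llvm_vfio_fuzz for a one-second budget per target. A minimal sketch of that loop, reconstructed only from the commands traced above (the $SPDK shorthand is assumed for brevity, and the flag comments are inferences from the trace, not documentation):

#!/usr/bin/env bash
# Sketch of the per-fuzzer driver traced in run.sh above.
SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
FUZZ_SRC=$SPDK/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c
FUZZER=$SPDK/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz

# One fuzz target per '.fn =' entry in the harness source (7 in this run).
fuzz_num=$(grep -c '\.fn =' "$FUZZ_SRC")

for ((i = 0; i < fuzz_num; i++)); do
    dir=/tmp/vfio-user-$i
    mkdir -p "$dir/domain/1" "$dir/domain/2" "$SPDK/../corpus/llvm_vfio_$i"
    # Template the shared config so each run points at its own domain dirs.
    sed -e "s%/tmp/vfio-user/domain/1%$dir/domain/1%; s%/tmp/vfio-user/domain/2%$dir/domain/2%" \
        "$SPDK/test/fuzz/llvm/vfio/fuzz_vfio_json.conf" > "$dir/fuzz_vfio_json.conf"
    # -Z appears to select the fuzz target index, -t the time budget in
    # seconds, -D the persistent corpus dir; treat these as illustrative.
    "$FUZZER" -m 0x1 -s 0 -P "$SPDK/../output/llvm/" \
        -F "$dir/domain/1" -c "$dir/fuzz_vfio_json.conf" -t 1 \
        -D "$SPDK/../corpus/llvm_vfio_$i" -Y "$dir/domain/2" \
        -r "$dir/spdk$i.sock" -Z "$i"
    rm -rf "$dir"
done

This matches the pattern visible in the log for targets 0 through 5: each iteration's mkdir/sed/llvm_vfio_fuzz trio appears verbatim in the xtrace, followed by the target's libFuzzer output and the cleanup of its /tmp/vfio-user-N directory.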
00:08:36.799 [2024-07-20 16:17:05.444786] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2280640 ] 00:08:36.799 EAL: No free 2048 kB hugepages reported on node 1 00:08:36.799 [2024-07-20 16:17:05.517013] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:36.799 [2024-07-20 16:17:05.554419] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:36.799 [2024-07-20 16:17:05.554575] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:37.058 INFO: Running with entropic power schedule (0xFF, 100). 00:08:37.058 INFO: Seed: 2499286697 00:08:37.058 INFO: Loaded 1 modules (338533 inline 8-bit counters): 338533 [0x2624f8c, 0x26779f1), 00:08:37.058 INFO: Loaded 1 PC tables (338533 PCs): 338533 [0x26779f8,0x2ba2048), 00:08:37.058 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:37.058 INFO: A corpus is not provided, starting from an empty corpus 00:08:37.058 #2 INITED exec/s: 0 rss: 60Mb 00:08:37.058 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:37.058 This may also happen if the target rejected all inputs we tried so far 00:08:37.574 NEW_FUNC[1/631]: 0x491220 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:85 00:08:37.574 NEW_FUNC[2/631]: 0x496dc0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:37.574 #3 NEW cov: 10709 ft: 10647 corp: 2/59b lim: 60 exec/s: 0 rss: 67Mb L: 58/58 MS: 1 InsertRepeatedBytes- 00:08:37.832 #4 NEW cov: 10723 ft: 13555 corp: 3/117b lim: 60 exec/s: 0 rss: 68Mb L: 58/58 MS: 1 ChangeBinInt- 00:08:37.832 NEW_FUNC[1/1]: 0x193de20 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:37.832 #5 NEW cov: 10743 ft: 14892 corp: 4/146b lim: 60 exec/s: 0 rss: 69Mb L: 29/58 MS: 1 EraseBytes- 00:08:38.091 #11 NEW cov: 10743 ft: 15769 corp: 5/186b lim: 60 exec/s: 0 rss: 69Mb L: 40/58 MS: 1 EraseBytes- 00:08:38.349 #12 NEW cov: 10743 ft: 16184 corp: 6/234b lim: 60 exec/s: 12 rss: 69Mb L: 48/58 MS: 1 CopyPart- 00:08:38.349 #13 NEW cov: 10743 ft: 16504 corp: 7/292b lim: 60 exec/s: 13 rss: 69Mb L: 58/58 MS: 1 ChangeBit- 00:08:38.607 #14 NEW cov: 10743 ft: 16693 corp: 8/350b lim: 60 exec/s: 14 rss: 69Mb L: 58/58 MS: 1 ChangeBinInt- 00:08:38.866 #15 NEW cov: 10743 ft: 17315 corp: 9/390b lim: 60 exec/s: 15 rss: 70Mb L: 40/58 MS: 1 EraseBytes- 00:08:38.866 #16 NEW cov: 10750 ft: 17433 corp: 10/419b lim: 60 exec/s: 16 rss: 70Mb L: 29/58 MS: 1 CopyPart- 00:08:39.125 #17 NEW cov: 10750 ft: 17605 corp: 11/448b lim: 60 exec/s: 8 rss: 70Mb L: 29/58 MS: 1 ChangeBit- 00:08:39.125 #17 DONE cov: 10750 ft: 17605 corp: 11/448b lim: 60 exec/s: 8 rss: 70Mb 00:08:39.125 Done 17 runs in 2 second(s) 00:08:39.384 16:17:08 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-0 00:08:39.384 16:17:08 -- ../common.sh@72 -- # (( i++ )) 00:08:39.384 16:17:08 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:39.384 16:17:08 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:08:39.384 16:17:08 -- vfio/run.sh@22 -- # local fuzzer_type=1 00:08:39.384 16:17:08 -- vfio/run.sh@23 -- # local timen=1 00:08:39.384 16:17:08 -- vfio/run.sh@24 -- # local core=0x1 00:08:39.384 16:17:08 -- 
vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:39.384 16:17:08 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:08:39.384 16:17:08 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:08:39.384 16:17:08 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:08:39.384 16:17:08 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:08:39.384 16:17:08 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:39.384 16:17:08 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:08:39.384 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:39.384 16:17:08 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1 00:08:39.384 [2024-07-20 16:17:08.091159] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:08:39.384 [2024-07-20 16:17:08.091258] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2281090 ] 00:08:39.384 EAL: No free 2048 kB hugepages reported on node 1 00:08:39.384 [2024-07-20 16:17:08.164061] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:39.642 [2024-07-20 16:17:08.199218] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:39.642 [2024-07-20 16:17:08.199378] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:39.642 INFO: Running with entropic power schedule (0xFF, 100). 00:08:39.642 INFO: Seed: 845292102 00:08:39.642 INFO: Loaded 1 modules (338533 inline 8-bit counters): 338533 [0x2624f8c, 0x26779f1), 00:08:39.642 INFO: Loaded 1 PC tables (338533 PCs): 338533 [0x26779f8,0x2ba2048), 00:08:39.642 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:39.642 INFO: A corpus is not provided, starting from an empty corpus 00:08:39.642 #2 INITED exec/s: 0 rss: 60Mb 00:08:39.642 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:39.642 This may also happen if the target rejected all inputs we tried so far 00:08:39.900 [2024-07-20 16:17:08.496481] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:39.900 [2024-07-20 16:17:08.496526] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:39.900 [2024-07-20 16:17:08.496544] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:40.159 NEW_FUNC[1/638]: 0x4917c0 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:72 00:08:40.159 NEW_FUNC[2/638]: 0x496dc0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:40.159 #5 NEW cov: 10727 ft: 10607 corp: 2/9b lim: 40 exec/s: 0 rss: 67Mb L: 8/8 MS: 3 CopyPart-CopyPart-InsertRepeatedBytes- 00:08:40.159 [2024-07-20 16:17:08.957542] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:40.159 [2024-07-20 16:17:08.957579] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:40.160 [2024-07-20 16:17:08.957597] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:40.418 #6 NEW cov: 10741 ft: 14467 corp: 3/17b lim: 40 exec/s: 0 rss: 68Mb L: 8/8 MS: 1 CrossOver- 00:08:40.418 [2024-07-20 16:17:09.142609] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:40.418 [2024-07-20 16:17:09.142633] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:40.418 [2024-07-20 16:17:09.142650] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:40.677 NEW_FUNC[1/1]: 0x193de20 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:40.677 #7 NEW cov: 10758 ft: 15700 corp: 4/25b lim: 40 exec/s: 0 rss: 69Mb L: 8/8 MS: 1 ChangeByte- 00:08:40.677 [2024-07-20 16:17:09.327698] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:40.677 [2024-07-20 16:17:09.327720] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:40.677 [2024-07-20 16:17:09.327737] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:40.677 #8 NEW cov: 10758 ft: 16136 corp: 5/33b lim: 40 exec/s: 8 rss: 69Mb L: 8/8 MS: 1 CrossOver- 00:08:40.936 [2024-07-20 16:17:09.514492] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:40.936 [2024-07-20 16:17:09.514514] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:40.936 [2024-07-20 16:17:09.514532] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:40.936 #14 NEW cov: 10758 ft: 16410 corp: 6/41b lim: 40 exec/s: 14 rss: 69Mb L: 8/8 MS: 1 ChangeBit- 00:08:40.936 [2024-07-20 16:17:09.701301] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:40.936 [2024-07-20 16:17:09.701323] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:40.936 [2024-07-20 16:17:09.701341] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:41.194 #15 NEW cov: 10758 ft: 16464 corp: 7/50b lim: 40 exec/s: 15 rss: 69Mb L: 9/9 MS: 1 InsertByte- 00:08:41.194 [2024-07-20 16:17:09.889186] vfio_user.c:3096:vfio_user_log: 
*ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:41.194 [2024-07-20 16:17:09.889210] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:41.194 [2024-07-20 16:17:09.889228] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:41.452 #16 NEW cov: 10758 ft: 16478 corp: 8/55b lim: 40 exec/s: 16 rss: 69Mb L: 5/9 MS: 1 EraseBytes- 00:08:41.452 [2024-07-20 16:17:10.090763] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:41.452 [2024-07-20 16:17:10.090792] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:41.452 [2024-07-20 16:17:10.090811] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:41.452 #22 NEW cov: 10758 ft: 16521 corp: 9/63b lim: 40 exec/s: 22 rss: 69Mb L: 8/9 MS: 1 ChangeByte- 00:08:41.711 [2024-07-20 16:17:10.282411] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:41.711 [2024-07-20 16:17:10.282435] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:41.711 [2024-07-20 16:17:10.282461] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:41.711 #23 NEW cov: 10765 ft: 16688 corp: 10/87b lim: 40 exec/s: 23 rss: 69Mb L: 24/24 MS: 1 InsertRepeatedBytes- 00:08:41.711 [2024-07-20 16:17:10.474319] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:41.711 [2024-07-20 16:17:10.474342] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:41.711 [2024-07-20 16:17:10.474360] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:41.969 #29 NEW cov: 10765 ft: 16837 corp: 11/95b lim: 40 exec/s: 14 rss: 69Mb L: 8/24 MS: 1 CopyPart- 00:08:41.969 #29 DONE cov: 10765 ft: 16837 corp: 11/95b lim: 40 exec/s: 14 rss: 69Mb 00:08:41.969 Done 29 runs in 2 second(s) 00:08:42.228 16:17:10 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-1 00:08:42.228 16:17:10 -- ../common.sh@72 -- # (( i++ )) 00:08:42.228 16:17:10 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:42.228 16:17:10 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:08:42.228 16:17:10 -- vfio/run.sh@22 -- # local fuzzer_type=2 00:08:42.228 16:17:10 -- vfio/run.sh@23 -- # local timen=1 00:08:42.229 16:17:10 -- vfio/run.sh@24 -- # local core=0x1 00:08:42.229 16:17:10 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:42.229 16:17:10 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:08:42.229 16:17:10 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:08:42.229 16:17:10 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:08:42.229 16:17:10 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:08:42.229 16:17:10 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:42.229 16:17:10 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:08:42.229 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:42.229 16:17:10 -- vfio/run.sh@38 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:08:42.229 [2024-07-20 16:17:10.886912] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:08:42.229 [2024-07-20 16:17:10.886985] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2281634 ] 00:08:42.229 EAL: No free 2048 kB hugepages reported on node 1 00:08:42.229 [2024-07-20 16:17:10.957679] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:42.229 [2024-07-20 16:17:10.992782] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:42.229 [2024-07-20 16:17:10.992926] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:42.488 INFO: Running with entropic power schedule (0xFF, 100). 00:08:42.488 INFO: Seed: 3639288175 00:08:42.488 INFO: Loaded 1 modules (338533 inline 8-bit counters): 338533 [0x2624f8c, 0x26779f1), 00:08:42.488 INFO: Loaded 1 PC tables (338533 PCs): 338533 [0x26779f8,0x2ba2048), 00:08:42.488 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:42.488 INFO: A corpus is not provided, starting from an empty corpus 00:08:42.488 #2 INITED exec/s: 0 rss: 60Mb 00:08:42.488 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:42.488 This may also happen if the target rejected all inputs we tried so far 00:08:42.488 [2024-07-20 16:17:11.267308] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:43.003 NEW_FUNC[1/636]: 0x4921a0 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:104 00:08:43.003 NEW_FUNC[2/636]: 0x496dc0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:43.003 #3 NEW cov: 10707 ft: 10614 corp: 2/10b lim: 80 exec/s: 0 rss: 66Mb L: 9/9 MS: 1 CMP- DE: "/N\273\317\177\362.\000"- 00:08:43.003 [2024-07-20 16:17:11.726110] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:43.261 #4 NEW cov: 10721 ft: 14297 corp: 3/19b lim: 80 exec/s: 0 rss: 68Mb L: 9/9 MS: 1 ChangeBit- 00:08:43.261 [2024-07-20 16:17:11.902028] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:43.261 NEW_FUNC[1/1]: 0x193de20 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:43.261 #5 NEW cov: 10738 ft: 15094 corp: 4/28b lim: 80 exec/s: 0 rss: 69Mb L: 9/9 MS: 1 PersAutoDict- DE: "/N\273\317\177\362.\000"- 00:08:43.519 [2024-07-20 16:17:12.080102] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:43.519 #6 NEW cov: 10738 ft: 15157 corp: 5/37b lim: 80 exec/s: 0 rss: 69Mb L: 9/9 MS: 1 PersAutoDict- DE: "/N\273\317\177\362.\000"- 00:08:43.519 [2024-07-20 16:17:12.253097] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:43.777 #7 NEW cov: 10738 ft: 15628 corp: 6/46b lim: 80 exec/s: 7 rss: 69Mb L: 9/9 MS: 1 CopyPart- 00:08:43.777 [2024-07-20 16:17:12.427549] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:43.777 #8 NEW cov: 10738 ft: 15731 corp: 7/55b lim: 80 exec/s: 8 rss: 69Mb L: 9/9 MS: 1 ChangeBinInt- 00:08:44.036 [2024-07-20 16:17:12.605556] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:44.036 #9 NEW cov: 10738 ft: 15925 corp: 8/64b lim: 80 exec/s: 9 rss: 69Mb L: 9/9 MS: 1 ChangeBit- 00:08:44.036 [2024-07-20 16:17:12.782377] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:44.294 #10 NEW cov: 10738 ft: 16817 corp: 9/73b lim: 80 exec/s: 10 rss: 69Mb L: 9/9 MS: 1 CMP- DE: "\000\000\000\000\005U@\351"- 00:08:44.294 [2024-07-20 16:17:12.962348] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:44.294 #11 NEW cov: 10745 ft: 17054 corp: 10/82b lim: 80 exec/s: 11 rss: 69Mb L: 9/9 MS: 1 ChangeBinInt- 00:08:44.553 [2024-07-20 16:17:13.135128] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:44.553 #12 NEW cov: 10745 ft: 17265 corp: 11/91b lim: 80 exec/s: 6 rss: 69Mb L: 9/9 MS: 1 ChangeByte- 00:08:44.553 #12 DONE cov: 10745 ft: 17265 corp: 11/91b lim: 80 exec/s: 6 rss: 69Mb 00:08:44.553 ###### Recommended dictionary. ###### 00:08:44.553 "/N\273\317\177\362.\000" # Uses: 2 00:08:44.553 "\000\000\000\000\005U@\351" # Uses: 0 00:08:44.553 ###### End of recommended dictionary. 
###### 00:08:44.553 Done 12 runs in 2 second(s) 00:08:44.812 16:17:13 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-2 00:08:44.812 16:17:13 -- ../common.sh@72 -- # (( i++ )) 00:08:44.812 16:17:13 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:44.812 16:17:13 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:08:44.812 16:17:13 -- vfio/run.sh@22 -- # local fuzzer_type=3 00:08:44.812 16:17:13 -- vfio/run.sh@23 -- # local timen=1 00:08:44.812 16:17:13 -- vfio/run.sh@24 -- # local core=0x1 00:08:44.812 16:17:13 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:44.812 16:17:13 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:08:44.812 16:17:13 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:08:44.812 16:17:13 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:08:44.812 16:17:13 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:08:44.812 16:17:13 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:44.812 16:17:13 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:08:44.812 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:44.812 16:17:13 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:08:44.812 [2024-07-20 16:17:13.522082] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:08:44.812 [2024-07-20 16:17:13.522153] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2282174 ] 00:08:44.812 EAL: No free 2048 kB hugepages reported on node 1 00:08:44.812 [2024-07-20 16:17:13.586524] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:45.070 [2024-07-20 16:17:13.623749] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:45.070 [2024-07-20 16:17:13.623886] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:45.070 INFO: Running with entropic power schedule (0xFF, 100). 00:08:45.070 INFO: Seed: 1982345310 00:08:45.070 INFO: Loaded 1 modules (338533 inline 8-bit counters): 338533 [0x2624f8c, 0x26779f1), 00:08:45.070 INFO: Loaded 1 PC tables (338533 PCs): 338533 [0x26779f8,0x2ba2048), 00:08:45.070 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:45.070 INFO: A corpus is not provided, starting from an empty corpus 00:08:45.070 #2 INITED exec/s: 0 rss: 60Mb 00:08:45.070 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:45.070 This may also happen if the target rejected all inputs we tried so far 00:08:45.589 NEW_FUNC[1/632]: 0x492880 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:125 00:08:45.589 NEW_FUNC[2/632]: 0x496dc0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:45.589 #9 NEW cov: 10703 ft: 10643 corp: 2/88b lim: 320 exec/s: 0 rss: 68Mb L: 87/87 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:45.589 #11 NEW cov: 10717 ft: 13353 corp: 3/185b lim: 320 exec/s: 0 rss: 69Mb L: 97/97 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:45.848 #17 NEW cov: 10717 ft: 14604 corp: 4/282b lim: 320 exec/s: 0 rss: 70Mb L: 97/97 MS: 1 ChangeBinInt- 00:08:45.848 #18 NEW cov: 10717 ft: 15480 corp: 5/379b lim: 320 exec/s: 0 rss: 70Mb L: 97/97 MS: 1 ChangeBinInt- 00:08:46.107 NEW_FUNC[1/1]: 0x193de20 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:46.107 #19 NEW cov: 10734 ft: 15860 corp: 6/430b lim: 320 exec/s: 0 rss: 70Mb L: 51/97 MS: 1 EraseBytes- 00:08:46.107 #20 NEW cov: 10734 ft: 15885 corp: 7/517b lim: 320 exec/s: 20 rss: 70Mb L: 87/97 MS: 1 ChangeBit- 00:08:46.366 #21 NEW cov: 10734 ft: 16245 corp: 8/604b lim: 320 exec/s: 21 rss: 70Mb L: 87/97 MS: 1 ChangeBinInt- 00:08:46.366 #22 NEW cov: 10734 ft: 16279 corp: 9/691b lim: 320 exec/s: 22 rss: 70Mb L: 87/97 MS: 1 ChangeByte- 00:08:46.625 #23 NEW cov: 10734 ft: 16361 corp: 10/788b lim: 320 exec/s: 23 rss: 70Mb L: 97/97 MS: 1 ShuffleBytes- 00:08:46.625 #24 NEW cov: 10734 ft: 16449 corp: 11/941b lim: 320 exec/s: 24 rss: 70Mb L: 153/153 MS: 1 CopyPart- 00:08:46.625 #26 NEW cov: 10734 ft: 16599 corp: 12/984b lim: 320 exec/s: 26 rss: 70Mb L: 43/153 MS: 2 ChangeByte-CrossOver- 00:08:46.883 #27 NEW cov: 10734 ft: 16638 corp: 13/1102b lim: 320 exec/s: 27 rss: 70Mb L: 118/153 MS: 1 InsertRepeatedBytes- 00:08:46.883 #28 NEW cov: 10741 ft: 16894 corp: 14/1199b lim: 320 exec/s: 28 rss: 70Mb L: 97/153 MS: 1 CopyPart- 00:08:47.141 #29 NEW cov: 10741 ft: 17147 corp: 15/1286b lim: 320 exec/s: 29 rss: 70Mb L: 87/153 MS: 1 EraseBytes- 00:08:47.141 #30 NEW cov: 10741 ft: 17188 corp: 16/1349b lim: 320 exec/s: 15 rss: 70Mb L: 63/153 MS: 1 EraseBytes- 00:08:47.141 #30 DONE cov: 10741 ft: 17188 corp: 16/1349b lim: 320 exec/s: 15 rss: 70Mb 00:08:47.141 Done 30 runs in 2 second(s) 00:08:47.399 16:17:16 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-3 00:08:47.399 16:17:16 -- ../common.sh@72 -- # (( i++ )) 00:08:47.399 16:17:16 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:47.399 16:17:16 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:08:47.399 16:17:16 -- vfio/run.sh@22 -- # local fuzzer_type=4 00:08:47.399 16:17:16 -- vfio/run.sh@23 -- # local timen=1 00:08:47.399 16:17:16 -- vfio/run.sh@24 -- # local core=0x1 00:08:47.400 16:17:16 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:47.400 16:17:16 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:08:47.400 16:17:16 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:08:47.400 16:17:16 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:08:47.400 16:17:16 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:08:47.400 16:17:16 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:47.400 16:17:16 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:08:47.400 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:47.400 16:17:16 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:08:47.400 [2024-07-20 16:17:16.175921] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization... 00:08:47.400 [2024-07-20 16:17:16.175995] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2282527 ] 00:08:47.657 EAL: No free 2048 kB hugepages reported on node 1 00:08:47.657 [2024-07-20 16:17:16.248577] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:47.657 [2024-07-20 16:17:16.284284] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:47.657 [2024-07-20 16:17:16.284453] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:47.657 INFO: Running with entropic power schedule (0xFF, 100). 00:08:47.657 INFO: Seed: 342375193 00:08:47.915 INFO: Loaded 1 modules (338533 inline 8-bit counters): 338533 [0x2624f8c, 0x26779f1), 00:08:47.915 INFO: Loaded 1 PC tables (338533 PCs): 338533 [0x26779f8,0x2ba2048), 00:08:47.915 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:47.915 INFO: A corpus is not provided, starting from an empty corpus 00:08:47.915 #2 INITED exec/s: 0 rss: 60Mb 00:08:47.915 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:47.915 This may also happen if the target rejected all inputs we tried so far 00:08:47.915 [2024-07-20 16:17:16.559540] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=323 offset=0 prot=0x3: Invalid argument 00:08:47.915 [2024-07-20 16:17:16.559577] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:47.915 [2024-07-20 16:17:16.559587] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:47.915 [2024-07-20 16:17:16.559608] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:47.915 [2024-07-20 16:17:16.564495] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:47.915 [2024-07-20 16:17:16.564511] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:47.915 [2024-07-20 16:17:16.564531] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:48.173 NEW_FUNC[1/638]: 0x493100 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:145 00:08:48.173 NEW_FUNC[2/638]: 0x496dc0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:48.173 #9 NEW cov: 10729 ft: 10530 corp: 2/108b lim: 320 exec/s: 0 rss: 67Mb L: 107/107 MS: 2 ChangeBinInt-InsertRepeatedBytes- 00:08:48.432 [2024-07-20 16:17:17.023056] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:48.432 [2024-07-20 16:17:17.023091] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:48.432 [2024-07-20 16:17:17.023101] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:48.432 [2024-07-20 16:17:17.023119] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:48.432 [2024-07-20 16:17:17.024056] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:48.432 [2024-07-20 16:17:17.024076] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:48.432 [2024-07-20 16:17:17.024092] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:48.432 #10 NEW cov: 10743 ft: 13807 corp: 3/215b lim: 320 exec/s: 0 rss: 68Mb L: 107/107 MS: 1 ChangeBit- 00:08:48.432 [2024-07-20 16:17:17.205027] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:48.432 [2024-07-20 16:17:17.205051] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:48.432 [2024-07-20 16:17:17.205061] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:48.432 [2024-07-20 16:17:17.205093] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:48.432 [2024-07-20 16:17:17.206036] vfio_user.c:3094:vfio_user_log: *WARNING*: 
/tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:48.432 [2024-07-20 16:17:17.206055] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:48.432 [2024-07-20 16:17:17.206071] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:48.690 NEW_FUNC[1/1]: 0x193de20 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:48.690 #11 NEW cov: 10760 ft: 14633 corp: 4/322b lim: 320 exec/s: 0 rss: 69Mb L: 107/107 MS: 1 ChangeBinInt- 00:08:48.948 #12 NEW cov: 10764 ft: 14951 corp: 5/380b lim: 320 exec/s: 12 rss: 69Mb L: 58/107 MS: 1 CrossOver- 00:08:48.948 #20 NEW cov: 10764 ft: 16008 corp: 6/468b lim: 320 exec/s: 20 rss: 69Mb L: 88/107 MS: 3 ChangeByte-ChangeByte-InsertRepeatedBytes- 00:08:49.207 #25 NEW cov: 10764 ft: 16392 corp: 7/584b lim: 320 exec/s: 25 rss: 69Mb L: 116/116 MS: 5 ChangeBit-ChangeByte-CMP-ChangeByte-CrossOver- DE: "\377\377\037\000\023\201q\374"- 00:08:49.207 [2024-07-20 16:17:17.945552] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:49.207 [2024-07-20 16:17:17.945578] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:49.207 [2024-07-20 16:17:17.945588] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:49.207 [2024-07-20 16:17:17.945620] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:49.207 [2024-07-20 16:17:17.946566] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:49.207 [2024-07-20 16:17:17.946585] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:49.207 [2024-07-20 16:17:17.946606] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:49.465 #26 NEW cov: 10764 ft: 16608 corp: 8/691b lim: 320 exec/s: 26 rss: 69Mb L: 107/116 MS: 1 ChangeBinInt- 00:08:49.465 [2024-07-20 16:17:18.128816] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:49.465 [2024-07-20 16:17:18.128839] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:49.465 [2024-07-20 16:17:18.128850] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:49.465 [2024-07-20 16:17:18.128882] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:49.465 [2024-07-20 16:17:18.129802] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:49.465 [2024-07-20 16:17:18.129821] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:49.465 [2024-07-20 16:17:18.129837] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:49.465 #27 NEW cov: 10764 ft: 17004 corp: 9/798b lim: 320 exec/s: 27 rss: 69Mb L: 107/116 MS: 1 ShuffleBytes- 00:08:49.723 [2024-07-20 16:17:18.307773] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: 
failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:49.723 [2024-07-20 16:17:18.307795] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:49.723 [2024-07-20 16:17:18.307805] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:49.723 [2024-07-20 16:17:18.307837] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:49.723 [2024-07-20 16:17:18.308798] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:49.723 [2024-07-20 16:17:18.308818] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:49.723 [2024-07-20 16:17:18.308835] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:49.723 #28 NEW cov: 10771 ft: 17090 corp: 10/905b lim: 320 exec/s: 28 rss: 69Mb L: 107/116 MS: 1 ChangeBit- 00:08:49.723 [2024-07-20 16:17:18.492218] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:49.723 [2024-07-20 16:17:18.492241] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:49.723 [2024-07-20 16:17:18.492251] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:49.723 [2024-07-20 16:17:18.492268] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:49.724 [2024-07-20 16:17:18.493241] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:49.724 [2024-07-20 16:17:18.493260] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:49.724 [2024-07-20 16:17:18.493275] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:49.983 #29 NEW cov: 10771 ft: 17495 corp: 11/1012b lim: 320 exec/s: 14 rss: 69Mb L: 107/116 MS: 1 ChangeBinInt- 00:08:49.983 #29 DONE cov: 10771 ft: 17495 corp: 11/1012b lim: 320 exec/s: 14 rss: 69Mb 00:08:49.983 ###### Recommended dictionary. ###### 00:08:49.983 "\377\377\037\000\023\201q\374" # Uses: 0 00:08:49.983 ###### End of recommended dictionary. 
00:08:49.983 Done 29 runs in 2 second(s)
00:08:50.242 16:17:18 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-4
00:08:50.242 16:17:18 -- ../common.sh@72 -- # (( i++ ))
00:08:50.242 16:17:18 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:50.242 16:17:18 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1
00:08:50.242 16:17:18 -- vfio/run.sh@22 -- # local fuzzer_type=5
00:08:50.242 16:17:18 -- vfio/run.sh@23 -- # local timen=1
00:08:50.242 16:17:18 -- vfio/run.sh@24 -- # local core=0x1
00:08:50.242 16:17:18 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5
00:08:50.242 16:17:18 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5
00:08:50.242 16:17:18 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1
00:08:50.242 16:17:18 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2
00:08:50.242 16:17:18 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf
00:08:50.242 16:17:18 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5
00:08:50.242 16:17:18 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%;
00:08:50.242 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf
00:08:50.242 16:17:18 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5
00:08:50.242 [2024-07-20 16:17:18.904224] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization...
00:08:50.242 [2024-07-20 16:17:18.904318] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2283019 ]
00:08:50.242 EAL: No free 2048 kB hugepages reported on node 1
00:08:50.242 [2024-07-20 16:17:18.977047] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:50.242 [2024-07-20 16:17:19.012065] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:50.242 [2024-07-20 16:17:19.012214] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:50.501 INFO: Running with entropic power schedule (0xFF, 100).
00:08:50.501 INFO: Seed: 3074369359
00:08:50.501 INFO: Loaded 1 modules (338533 inline 8-bit counters): 338533 [0x2624f8c, 0x26779f1),
00:08:50.501 INFO: Loaded 1 PC tables (338533 PCs): 338533 [0x26779f8,0x2ba2048),
00:08:50.501 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5
00:08:50.501 INFO: A corpus is not provided, starting from an empty corpus
00:08:50.501 #2 INITED exec/s: 0 rss: 60Mb
00:08:50.501 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:50.501 This may also happen if the target rejected all inputs we tried so far
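[Editor's note] The vfio/run.sh xtrace above (run.sh@22 through run.sh@38) records the per-target setup for fuzzer type 5. A condensed sketch of what those traced commands amount to follows; this is a paraphrase, not the actual run.sh source, and the WORKSPACE shorthand plus the redirection of sed's output into the per-run config are assumptions:

  # Per-target setup, reconstructed from the trace above (fuzzer type 5 shown).
  WORKSPACE=/var/jenkins/workspace/short-fuzz-phy-autotest   # assumed shorthand for the job workspace
  fuzzer_type=5
  fuzzer_dir=/tmp/vfio-user-$fuzzer_type
  vfiouser_dir=$fuzzer_dir/domain/1            # vfio-user domain/socket directory
  vfiouser_io_dir=$fuzzer_dir/domain/2
  vfiouser_cfg=$fuzzer_dir/fuzz_vfio_json.conf
  corpus_dir=$WORKSPACE/spdk/../corpus/llvm_vfio_$fuzzer_type
  # Create scratch and corpus directories (run.sh@31 trace).
  mkdir -p "$fuzzer_dir" "$vfiouser_dir" "$vfiouser_io_dir" "$corpus_dir"
  # Rewrite the shared config template to point at this run's domain directories (run.sh@34);
  # the redirection into $vfiouser_cfg is an assumption, as xtrace does not show it.
  sed -e "s%/tmp/vfio-user/domain/1%$vfiouser_dir%; s%/tmp/vfio-user/domain/2%$vfiouser_io_dir%" \
    "$WORKSPACE/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf" > "$vfiouser_cfg"
  # Launch fuzzer type 5 on core mask 0x1 for 1 second (run.sh@38; -P output path as in the trace).
  "$WORKSPACE/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz" -m 0x1 -s 0 \
    -P "$WORKSPACE/spdk/../output/llvm/" -F "$vfiouser_dir" -c "$vfiouser_cfg" -t 1 \
    -D "$corpus_dir" -Y "$vfiouser_io_dir" -r "$fuzzer_dir/spdk$fuzzer_type.sock" -Z "$fuzzer_type"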
00:08:50.501 [2024-07-20 16:17:19.294489] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:50.501 [2024-07-20 16:17:19.294536] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:51.020 NEW_FUNC[1/638]: 0x493b00 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:172
00:08:51.020 NEW_FUNC[2/638]: 0x496dc0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220
00:08:51.020 #3 NEW cov: 10728 ft: 10678 corp: 2/53b lim: 120 exec/s: 0 rss: 66Mb L: 52/52 MS: 1 InsertRepeatedBytes-
00:08:51.020 [2024-07-20 16:17:19.767440] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:51.020 [2024-07-20 16:17:19.767491] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:51.278 #4 NEW cov: 10743 ft: 13916 corp: 3/105b lim: 120 exec/s: 0 rss: 68Mb L: 52/52 MS: 1 ChangeBinInt-
00:08:51.278 [2024-07-20 16:17:19.960221] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:51.278 [2024-07-20 16:17:19.960251] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:51.278 NEW_FUNC[1/1]: 0x193de20 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:08:51.278 #5 NEW cov: 10760 ft: 15243 corp: 4/157b lim: 120 exec/s: 0 rss: 69Mb L: 52/52 MS: 1 ChangeBinInt-
00:08:51.536 [2024-07-20 16:17:20.145672] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:51.536 [2024-07-20 16:17:20.145708] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:51.536 #6 NEW cov: 10760 ft: 15703 corp: 5/209b lim: 120 exec/s: 6 rss: 69Mb L: 52/52 MS: 1 ChangeBinInt-
00:08:51.537 [2024-07-20 16:17:20.332277] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:51.537 [2024-07-20 16:17:20.332307] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:51.795 #7 NEW cov: 10760 ft: 16240 corp: 6/261b lim: 120 exec/s: 7 rss: 69Mb L: 52/52 MS: 1 CMP- DE: "\377\377\037\000\023\201p\000"-
00:08:51.795 [2024-07-20 16:17:20.520527] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:51.795 [2024-07-20 16:17:20.520557] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:52.053 #8 NEW cov: 10760 ft: 16428 corp: 7/342b lim: 120 exec/s: 8 rss: 69Mb L: 81/81 MS: 1 CopyPart-
00:08:52.053 [2024-07-20 16:17:20.708798] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:52.053 [2024-07-20 16:17:20.708830] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:52.053 #9 NEW cov: 10760 ft: 17075 corp: 8/394b lim: 120 exec/s: 9 rss: 69Mb L: 52/81 MS: 1 ChangeBit-
00:08:52.311 [2024-07-20 16:17:20.898402] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:52.311 [2024-07-20 16:17:20.898433] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:52.311 #11 NEW cov: 10760 ft: 17158 corp: 9/434b lim: 120 exec/s: 11 rss: 69Mb L: 40/81 MS: 2 ChangeByte-InsertRepeatedBytes-
00:08:52.311 [2024-07-20 16:17:21.095340] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:52.311 [2024-07-20 16:17:21.095371] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:52.569 #12 NEW cov: 10767 ft: 17293 corp: 10/515b lim: 120 exec/s: 12 rss: 69Mb L: 81/81 MS: 1 ShuffleBytes-
00:08:52.569 [2024-07-20 16:17:21.283270] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:52.569 [2024-07-20 16:17:21.283300] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:52.827 #13 NEW cov: 10767 ft: 17330 corp: 11/596b lim: 120 exec/s: 6 rss: 69Mb L: 81/81 MS: 1 CrossOver-
00:08:52.827 #13 DONE cov: 10767 ft: 17330 corp: 11/596b lim: 120 exec/s: 6 rss: 69Mb
00:08:52.827 ###### Recommended dictionary. ######
00:08:52.827 "\377\377\037\000\023\201p\000" # Uses: 0
00:08:52.827 ###### End of recommended dictionary. ######
00:08:52.827 Done 13 runs in 2 second(s)
00:08:53.086 16:17:21 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-5
00:08:53.086 16:17:21 -- ../common.sh@72 -- # (( i++ ))
00:08:53.086 16:17:21 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:53.086 16:17:21 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1
00:08:53.086 16:17:21 -- vfio/run.sh@22 -- # local fuzzer_type=6
00:08:53.086 16:17:21 -- vfio/run.sh@23 -- # local timen=1
00:08:53.086 16:17:21 -- vfio/run.sh@24 -- # local core=0x1
00:08:53.086 16:17:21 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6
00:08:53.086 16:17:21 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6
00:08:53.086 16:17:21 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1
00:08:53.086 16:17:21 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2
00:08:53.086 16:17:21 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf
00:08:53.086 16:17:21 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6
00:08:53.086 16:17:21 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%;
00:08:53.086 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf
00:08:53.086 16:17:21 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6
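[Editor's note] The ../common.sh@72 and ../common.sh@73 markers that repeat between runs are consistent with a simple arithmetic for-loop driving start_llvm_fuzz over each fuzzer type. A minimal sketch, assuming this shape for common.sh (the real script and the value of fuzz_num may differ):

  # Assumed driver loop behind the (( i++ )) / (( i < fuzz_num )) trace markers.
  fuzz_num=7   # assumed count; the log shows at least types 4, 5 and 6 near the end of the sequence
  for (( i = 0; i < fuzz_num; i++ )); do
    start_llvm_fuzz "$i" 1 0x1   # args per the trace: fuzzer type, run time in seconds, core mask
  done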
00:08:53.086 [2024-07-20 16:17:21.694849] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 22.11.4 initialization...
00:08:53.086 [2024-07-20 16:17:21.694922] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2283562 ]
00:08:53.086 EAL: No free 2048 kB hugepages reported on node 1
00:08:53.086 [2024-07-20 16:17:21.764962] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:53.086 [2024-07-20 16:17:21.800320] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:53.086 [2024-07-20 16:17:21.800501] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:53.344 INFO: Running with entropic power schedule (0xFF, 100).
00:08:53.344 INFO: Seed: 1566383038
00:08:53.345 INFO: Loaded 1 modules (338533 inline 8-bit counters): 338533 [0x2624f8c, 0x26779f1),
00:08:53.345 INFO: Loaded 1 PC tables (338533 PCs): 338533 [0x26779f8,0x2ba2048),
00:08:53.345 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6
00:08:53.345 INFO: A corpus is not provided, starting from an empty corpus
00:08:53.345 #2 INITED exec/s: 0 rss: 59Mb
00:08:53.345 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:53.345 This may also happen if the target rejected all inputs we tried so far
00:08:53.345 [2024-07-20 16:17:22.053487] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:53.345 [2024-07-20 16:17:22.053533] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:53.860 NEW_FUNC[1/637]: 0x4947f0 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190
00:08:53.860 NEW_FUNC[2/637]: 0x496dc0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220
00:08:53.860 #4 NEW cov: 10722 ft: 10681 corp: 2/44b lim: 90 exec/s: 0 rss: 67Mb L: 43/43 MS: 2 ShuffleBytes-InsertRepeatedBytes-
00:08:53.860 [2024-07-20 16:17:22.485273] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:53.860 [2024-07-20 16:17:22.485314] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:53.860 NEW_FUNC[1/1]: 0x1664100 in nvme_payload_type /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/./nvme_internal.h:260
00:08:53.860 #5 NEW cov: 10739 ft: 13121 corp: 3/117b lim: 90 exec/s: 0 rss: 68Mb L: 73/73 MS: 1 InsertRepeatedBytes-
00:08:53.860 [2024-07-20 16:17:22.620273] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:53.860 [2024-07-20 16:17:22.620308] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:54.118 #6 NEW cov: 10739 ft: 13679 corp: 4/146b lim: 90 exec/s: 0 rss: 69Mb L: 29/73 MS: 1 EraseBytes-
00:08:54.118 [2024-07-20 16:17:22.735156] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:54.118 [2024-07-20 16:17:22.735192] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:54.118 NEW_FUNC[1/1]: 0x193de20 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:08:54.118 #7 NEW cov: 10756 ft: 14753 corp: 5/232b lim: 90 exec/s: 0 rss: 69Mb L: 86/86 MS: 1 CopyPart-
00:08:54.118 [2024-07-20 16:17:22.849943] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:54.118 [2024-07-20 16:17:22.849980] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:54.118 #8 NEW cov: 10756 ft: 15311 corp: 6/318b lim: 90 exec/s: 0 rss: 69Mb L: 86/86 MS: 1 ShuffleBytes-
00:08:54.376 [2024-07-20 16:17:22.964784] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:54.376 [2024-07-20 16:17:22.964817] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:54.376 #9 NEW cov: 10756 ft: 15388 corp: 7/361b lim: 90 exec/s: 9 rss: 69Mb L: 43/86 MS: 1 CopyPart-
00:08:54.376 [2024-07-20 16:17:23.079598] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:54.376 [2024-07-20 16:17:23.079637] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:54.376 #10 NEW cov: 10756 ft: 15712 corp: 8/448b lim: 90 exec/s: 10 rss: 69Mb L: 87/87 MS: 1 InsertByte-
00:08:54.634 [2024-07-20 16:17:23.204558] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:54.634 [2024-07-20 16:17:23.204602] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:54.634 #11 NEW cov: 10756 ft: 15791 corp: 9/521b lim: 90 exec/s: 11 rss: 69Mb L: 73/87 MS: 1 CrossOver-
00:08:54.634 [2024-07-20 16:17:23.319496] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:54.634 [2024-07-20 16:17:23.319528] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:54.634 #12 NEW cov: 10756 ft: 16116 corp: 10/610b lim: 90 exec/s: 12 rss: 69Mb L: 89/89 MS: 1 CopyPart-
00:08:54.634 [2024-07-20 16:17:23.434428] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:54.634 [2024-07-20 16:17:23.434467] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:54.891 #13 NEW cov: 10756 ft: 16316 corp: 11/664b lim: 90 exec/s: 13 rss: 69Mb L: 54/89 MS: 1 InsertRepeatedBytes-
00:08:54.891 [2024-07-20 16:17:23.549243] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:54.891 [2024-07-20 16:17:23.549275] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:54.891 #15 NEW cov: 10756 ft: 16377 corp: 12/726b lim: 90 exec/s: 15 rss: 69Mb L: 62/89 MS: 2 CopyPart-InsertRepeatedBytes-
00:08:54.891 [2024-07-20 16:17:23.663071] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:54.891 [2024-07-20 16:17:23.663103] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:55.148 #16 NEW cov: 10756 ft: 16407 corp: 13/781b lim: 90 exec/s: 16 rss: 69Mb L: 55/89 MS: 1 InsertByte-
00:08:55.148 [2024-07-20 16:17:23.777849] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:55.148 [2024-07-20 16:17:23.777883] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:55.148 #17 NEW cov: 10763 ft: 16703 corp: 14/824b lim: 90 exec/s: 17 rss: 69Mb L: 43/89 MS: 1 ShuffleBytes-
00:08:55.148 [2024-07-20 16:17:23.892713] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:55.148 [2024-07-20 16:17:23.892748] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:55.406 #18 NEW cov: 10763 ft: 16750 corp: 15/913b lim: 90 exec/s: 18 rss: 70Mb L: 89/89 MS: 1 ChangeByte-
00:08:55.406 [2024-07-20 16:17:24.007437] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:55.406 [2024-07-20 16:17:24.007479] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:55.406 #19 NEW cov: 10763 ft: 16967 corp: 16/948b lim: 90 exec/s: 9 rss: 70Mb L: 35/89 MS: 1 CopyPart-
00:08:55.406 #19 DONE cov: 10763 ft: 16967 corp: 16/948b lim: 90 exec/s: 9 rss: 70Mb
00:08:55.406 Done 19 runs in 2 second(s)
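[Editor's note] Each run above closes with a '#N DONE' status line in libFuzzer's format: cov is covered code points, ft coverage features, corp the corpus entry count and total byte size, lim the current input-length limit, exec/s the execution rate and rss resident memory. To pull these summaries out of a saved console log, an illustrative one-liner (the build.log filename and the leading Jenkins timestamp column are assumptions):

  # Prints e.g. "cov=10763 corpus=16/948b exec/s=9" for the run above; field
  # positions assume each line starts with the Jenkins HH:MM:SS.mmm timestamp.
  grep -E '#[0-9]+ DONE' build.log | awk '{print "cov=" $5, "corpus=" $9, "exec/s=" $13}'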
00:08:55.664 16:17:24 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-6
00:08:55.664 16:17:24 -- ../common.sh@72 -- # (( i++ ))
00:08:55.664 16:17:24 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:55.664 16:17:24 -- vfio/run.sh@75 -- # trap - SIGINT SIGTERM EXIT
00:08:55.664
00:08:55.664 real 0m19.204s
00:08:55.664 user 0m26.751s
00:08:55.664 sys 0m1.804s
00:08:55.664 16:17:24 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:55.664 16:17:24 -- common/autotest_common.sh@10 -- # set +x
00:08:55.664 ************************************
00:08:55.664 END TEST vfio_fuzz
00:08:55.664 ************************************
00:08:55.664 16:17:24 -- fuzz/llvm.sh@67 -- # [[ 1 -eq 0 ]]
00:08:55.664
00:08:55.664 real 1m21.659s
00:08:55.664 user 2m5.338s
00:08:55.664 sys 0m9.143s
00:08:55.664 16:17:24 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:55.664 16:17:24 -- common/autotest_common.sh@10 -- # set +x
00:08:55.664 ************************************
00:08:55.664 END TEST llvm_fuzz
00:08:55.664 ************************************
00:08:55.664 16:17:24 -- spdk/autotest.sh@378 -- # [[ 0 -eq 1 ]]
00:08:55.664 16:17:24 -- spdk/autotest.sh@383 -- # trap - SIGINT SIGTERM EXIT
00:08:55.664 16:17:24 -- spdk/autotest.sh@385 -- # timing_enter post_cleanup
00:08:55.664 16:17:24 -- common/autotest_common.sh@712 -- # xtrace_disable
00:08:55.664 16:17:24 -- common/autotest_common.sh@10 -- # set +x
00:08:55.664 16:17:24 -- spdk/autotest.sh@386 -- # autotest_cleanup
00:08:55.664 16:17:24 -- common/autotest_common.sh@1371 -- # local autotest_es=0
00:08:55.664 16:17:24 -- common/autotest_common.sh@1372 -- # xtrace_disable
00:08:55.664 16:17:24 -- common/autotest_common.sh@10 -- # set +x
00:09:02.222 INFO: APP EXITING
00:09:02.222 INFO: killing all VMs
00:09:02.222 INFO: killing vhost app
00:09:02.222 INFO: EXIT DONE
00:09:04.755 Waiting for block devices as requested
00:09:04.755 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma
00:09:04.755 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma
00:09:04.755 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma
00:09:05.013 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma
00:09:05.013 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma
00:09:05.013 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma
00:09:05.013 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma
00:09:05.271 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma
00:09:05.271 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma
00:09:05.271 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma
00:09:05.529 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma
00:09:05.529 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma
00:09:05.529 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma
00:09:05.818 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma
00:09:05.818 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma
00:09:05.818 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma
00:09:06.121 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme
00:09:09.407 Cleaning
00:09:09.407 Removing: /dev/shm/spdk_tgt_trace.pid2247067
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2244610
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2245862
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2247067
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2247767
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2248051
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2248362
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2248694
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2248970
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2249182
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2249466
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2249773
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2250630
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2253573
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2253875
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2254176
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2254439
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2255011
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2255085
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2255598
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2255866
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2256170
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2256220
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2256478
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2256707
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2257213
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2257525
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2257748
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2258073
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2258613
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2258639
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2258828
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2258980
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2259257
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2259523
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2259812
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2260078
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2260307
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2260454
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2260672
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2260940
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2261221
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2261489
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2261776
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2261932
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2262121
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2262360
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2262644
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2262913
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2263197
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2263436
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2263627
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2263779
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2264063
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2264331
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2264612
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2264880
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2265092
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2265238
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2265488
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2265760
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2266043
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2266309
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2266592
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2266758
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2266938
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2267180
00:09:09.407 Removing: /var/run/dpdk/spdk_pid2267473
00:09:09.665 Removing: /var/run/dpdk/spdk_pid2267742
00:09:09.665 Removing: /var/run/dpdk/spdk_pid2268023
00:09:09.665 Removing: /var/run/dpdk/spdk_pid2268279
00:09:09.665 Removing: /var/run/dpdk/spdk_pid2268470
00:09:09.666 Removing: /var/run/dpdk/spdk_pid2268647
00:09:09.666 Removing: /var/run/dpdk/spdk_pid2268800
00:09:09.666 Removing: /var/run/dpdk/spdk_pid2269449
00:09:09.666 Removing: /var/run/dpdk/spdk_pid2269884
00:09:09.666 Removing: /var/run/dpdk/spdk_pid2270276
00:09:09.666 Removing: /var/run/dpdk/spdk_pid2270823
00:09:09.666 Removing: /var/run/dpdk/spdk_pid2271131
00:09:09.666 Removing: /var/run/dpdk/spdk_pid2271657
00:09:09.666 Removing: /var/run/dpdk/spdk_pid2272167
00:09:09.666 Removing: /var/run/dpdk/spdk_pid2272489
00:09:09.666 Removing: /var/run/dpdk/spdk_pid2273028
00:09:09.666 Removing: /var/run/dpdk/spdk_pid2273442
00:09:09.666 Removing: /var/run/dpdk/spdk_pid2273862
00:09:09.666 Removing: /var/run/dpdk/spdk_pid2274399
00:09:09.666 Removing: /var/run/dpdk/spdk_pid2274693
00:09:09.666 Removing: /var/run/dpdk/spdk_pid2275230
00:09:09.666 Removing: /var/run/dpdk/spdk_pid2275712
00:09:09.666 Removing: /var/run/dpdk/spdk_pid2276066
00:09:09.666 Removing: /var/run/dpdk/spdk_pid2276607
00:09:09.666 Removing: /var/run/dpdk/spdk_pid2276929
00:09:09.666 Removing: /var/run/dpdk/spdk_pid2277434
00:09:09.666 Removing: /var/run/dpdk/spdk_pid2277944
00:09:09.666 Removing: /var/run/dpdk/spdk_pid2278267
00:09:09.666 Removing: /var/run/dpdk/spdk_pid2278810
00:09:09.666 Removing: /var/run/dpdk/spdk_pid2279186
00:09:09.666 Removing: /var/run/dpdk/spdk_pid2279643
00:09:09.666 Removing: /var/run/dpdk/spdk_pid2280173
00:09:09.666 Removing: /var/run/dpdk/spdk_pid2280640
00:09:09.666 Removing: /var/run/dpdk/spdk_pid2281090
00:09:09.666 Removing: /var/run/dpdk/spdk_pid2281634
00:09:09.666 Removing: /var/run/dpdk/spdk_pid2282174
00:09:09.666 Removing: /var/run/dpdk/spdk_pid2282527
00:09:09.666 Removing: /var/run/dpdk/spdk_pid2283019
00:09:09.666 Removing: /var/run/dpdk/spdk_pid2283562
00:09:09.666 Clean
00:09:13.851 killing process with pid 2199862
00:09:13.851 killing process with pid 2199859
00:09:13.851 killing process with pid 2199861
00:09:13.851 killing process with pid 2199860
00:09:13.851 16:17:41 -- common/autotest_common.sh@1436 -- # return 0
00:09:13.851 16:17:41 -- spdk/autotest.sh@387 -- # timing_exit post_cleanup
00:09:13.851 16:17:41 -- common/autotest_common.sh@718 -- # xtrace_disable
00:09:13.851 16:17:41 -- common/autotest_common.sh@10 -- # set +x
00:09:13.851 16:17:42 -- spdk/autotest.sh@389 -- # timing_exit autotest
00:09:13.851 16:17:42 -- common/autotest_common.sh@718 -- # xtrace_disable
00:09:13.851 16:17:42 -- common/autotest_common.sh@10 -- # set +x
00:09:13.851 16:17:42 -- spdk/autotest.sh@390 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:09:13.851 16:17:42 -- spdk/autotest.sh@392 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]]
00:09:13.851 16:17:42 -- spdk/autotest.sh@392 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log
00:09:13.851 16:17:42 -- spdk/autotest.sh@394 -- # hash lcov
00:09:13.851 16:17:42 -- spdk/autotest.sh@394 -- # [[ CC_TYPE=clang == *\c\l\a\n\g* ]]
00:09:13.851 16:17:42 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
00:09:13.851 16:17:42 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]]
00:09:13.851 16:17:42 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:09:13.851 16:17:42 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:09:13.851 16:17:42 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:13.851 16:17:42 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:13.851 16:17:42 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:13.851 16:17:42 -- paths/export.sh@5 -- $ export PATH
00:09:13.851 16:17:42 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:13.851 16:17:42 -- common/autobuild_common.sh@434 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output
00:09:13.851 16:17:42 -- common/autobuild_common.sh@435 -- $ date +%s
00:09:13.851 16:17:42 -- common/autobuild_common.sh@435 -- $ mktemp -dt spdk_1721485062.XXXXXX
00:09:13.851 16:17:42 -- common/autobuild_common.sh@435 -- $ SPDK_WORKSPACE=/tmp/spdk_1721485062.nBLvis
00:09:13.851 16:17:42 -- common/autobuild_common.sh@437 -- $ [[ -n '' ]]
00:09:13.851 16:17:42 -- common/autobuild_common.sh@441 -- $ '[' -n v22.11.4 ']'
00:09:13.851 16:17:42 -- common/autobuild_common.sh@442 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:09:13.851 16:17:42 -- common/autobuild_common.sh@442 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk'
00:09:13.851 16:17:42 -- common/autobuild_common.sh@448 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp'
00:09:13.851 16:17:42 -- common/autobuild_common.sh@450 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:09:13.851 16:17:42 -- common/autobuild_common.sh@451 -- $ get_config_params
00:09:13.851 16:17:42 -- common/autotest_common.sh@387 -- $ xtrace_disable
00:09:13.851 16:17:42 -- common/autotest_common.sh@10 -- $ set +x
00:09:13.851 16:17:42 -- common/autobuild_common.sh@451 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user'
00:09:13.851 16:17:42 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j112
00:09:13.851 16:17:42 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:13.851 16:17:42 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]]
00:09:13.851 16:17:42 -- spdk/autopackage.sh@18 -- $ [[ 1 -eq 0 ]]
00:09:13.851 16:17:42 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]]
00:09:13.851 16:17:42 -- spdk/autopackage.sh@19 -- $ timing_finish
00:09:13.851 16:17:42 -- common/autotest_common.sh@724 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl
00:09:13.851 16:17:42 -- common/autotest_common.sh@725 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']'
00:09:13.851 16:17:42 -- common/autotest_common.sh@727 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:09:13.851 16:17:42 -- spdk/autopackage.sh@20 -- $ exit 0
00:09:13.860 + [[ -n 2144537 ]]
00:09:13.860 + sudo kill 2144537
00:09:13.869 [Pipeline] }
00:09:13.887 [Pipeline] // stage
00:09:13.892 [Pipeline] }
00:09:13.907 [Pipeline] // timeout
00:09:13.912 [Pipeline] }
00:09:13.927 [Pipeline] // catchError
00:09:13.932 [Pipeline] }
00:09:13.950 [Pipeline] // wrap
00:09:13.955 [Pipeline] }
00:09:13.969 [Pipeline] // catchError
00:09:13.978 [Pipeline] stage
00:09:13.981 [Pipeline] { (Epilogue)
00:09:13.995 [Pipeline] catchError
00:09:13.997 [Pipeline] {
00:09:14.011 [Pipeline] echo
00:09:14.013 Cleanup processes
00:09:14.018 [Pipeline] sh
00:09:14.290 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:14.290 2292347 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:14.304 [Pipeline] sh
00:09:14.584 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:14.584 ++ grep -v 'sudo pgrep'
00:09:14.584 ++ awk '{print $1}'
00:09:14.584 + sudo kill -9
00:09:14.584 + true
00:09:14.610 [Pipeline] sh
00:09:14.889 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh
00:09:14.889 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,721 MiB
00:09:14.889 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,721 MiB
00:09:15.835 [Pipeline] sh
00:09:16.132 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh
00:09:16.132 Artifacts sizes are good
00:09:16.146 [Pipeline] archiveArtifacts
00:09:16.153 Archiving artifacts
00:09:16.204 [Pipeline] sh
00:09:16.485 + sudo chown -R sys_sgci /var/jenkins/workspace/short-fuzz-phy-autotest
00:09:16.497 [Pipeline] cleanWs
00:09:16.504 [WS-CLEANUP] Deleting project workspace...
00:09:16.504 [WS-CLEANUP] Deferred wipeout is used...
00:09:16.510 [WS-CLEANUP] done
00:09:16.511 [Pipeline] }
00:09:16.530 [Pipeline] // catchError
00:09:16.540 [Pipeline] sh
00:09:16.821 + logger -p user.info -t JENKINS-CI
00:09:16.830 [Pipeline] }
00:09:16.846 [Pipeline] // stage
00:09:16.851 [Pipeline] }
00:09:16.865 [Pipeline] // node
00:09:16.870 [Pipeline] End of Pipeline
00:09:16.973 Finished: SUCCESS
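[Editor's note] A closing remark on the Epilogue's process cleanup above: the '++ pgrep ... ++ grep -v ... ++ awk' command substitution followed by '+ sudo kill -9' with no arguments and then '+ true' is the standard tolerate-empty reap idiom. Reconstructed as a standalone sketch, with the guard placement being an assumption inferred from the trace:

  # Kill anything still running from the workspace; when the substitution is empty,
  # kill -9 exits non-zero and '|| true' keeps the stage green, matching the
  # '+ sudo kill -9' / '+ true' pair in the trace.
  sudo kill -9 $(sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk \
    | grep -v 'sudo pgrep' | awk '{print $1}') || true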