00:00:00.002 Started by upstream project "autotest-spdk-v24.01-LTS-vs-dpdk-v22.11" build number 841
00:00:00.002 originally caused by:
00:00:00.002 Started by upstream project "nightly-trigger" build number 3506
00:00:00.002 originally caused by:
00:00:00.002 Started by timer
00:00:00.002 Started by timer
00:00:00.053 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy
00:00:00.055 The recommended git tool is: git
00:00:00.055 using credential 00000000-0000-0000-0000-000000000002
00:00:00.060 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.079 Fetching changes from the remote Git repository
00:00:00.083 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.114 Using shallow fetch with depth 1
00:00:00.114 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.114 > git --version # timeout=10
00:00:00.167 > git --version # 'git version 2.39.2'
00:00:00.167 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.208 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.208 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:04.715 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:04.725 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:04.736 Checking out Revision 4f3f5a4a30726c4eea24a2c31f6bdf50c75a515d (FETCH_HEAD)
00:00:04.736 > git config core.sparsecheckout # timeout=10
00:00:04.747 > git read-tree -mu HEAD # timeout=10
00:00:04.764 > git checkout -f 4f3f5a4a30726c4eea24a2c31f6bdf50c75a515d # timeout=5
00:00:04.783 Commit message: "jenkins/jjb-config: Adjust vs-dpdk config for v24.09"
00:00:04.783 > git rev-list --no-walk 4f3f5a4a30726c4eea24a2c31f6bdf50c75a515d # timeout=10
00:00:04.862 [Pipeline] Start of Pipeline
00:00:04.873 [Pipeline] library
00:00:04.874 Loading library shm_lib@master
00:00:04.874 Library shm_lib@master is cached. Copying from home.
00:00:04.889 [Pipeline] node
00:00:04.904 Running on WFP20 in /var/jenkins/workspace/short-fuzz-phy-autotest
00:00:04.906 [Pipeline] {
00:00:04.917 [Pipeline] catchError
00:00:04.918 [Pipeline] {
00:00:04.931 [Pipeline] wrap
00:00:04.939 [Pipeline] {
00:00:04.945 [Pipeline] stage
00:00:04.946 [Pipeline] { (Prologue)
00:00:05.219 [Pipeline] sh
00:00:05.502 + logger -p user.info -t JENKINS-CI
00:00:05.515 [Pipeline] echo
00:00:05.516 Node: WFP20
00:00:05.523 [Pipeline] sh
00:00:05.819 [Pipeline] setCustomBuildProperty
00:00:05.827 [Pipeline] echo
00:00:05.828 Cleanup processes
00:00:05.832 [Pipeline] sh
00:00:06.114 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:06.114 886058 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:06.126 [Pipeline] sh
00:00:06.412 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:06.412 ++ grep -v 'sudo pgrep'
00:00:06.412 ++ awk '{print $1}'
00:00:06.412 + sudo kill -9
00:00:06.412 + true
00:00:06.425 [Pipeline] cleanWs
00:00:06.433 [WS-CLEANUP] Deleting project workspace...
00:00:06.433 [WS-CLEANUP] Deferred wipeout is used...
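The "Cleanup processes" step traced above reduces to one small shell idiom. A minimal sketch of it, assuming nothing beyond what the trace shows (the workspace path is this job's; the variable names are illustrative):

    # List processes whose command line mentions the workspace, drop the
    # pgrep invocation itself, keep only the PIDs, and kill whatever is left.
    # kill -9 with an empty PID list exits non-zero, so "|| true" (the
    # "+ true" in the trace) keeps the step from failing on a clean node.
    ws=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    pids=$(sudo pgrep -af "$ws" | grep -v 'sudo pgrep' | awk '{print $1}')
    sudo kill -9 $pids || true   # $pids deliberately unquoted so PIDs word-split
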
00:00:06.439 [WS-CLEANUP] done
00:00:06.444 [Pipeline] setCustomBuildProperty
00:00:06.462 [Pipeline] sh
00:00:06.748 + sudo git config --global --replace-all safe.directory '*'
00:00:06.832 [Pipeline] httpRequest
00:00:07.428 [Pipeline] echo
00:00:07.430 Sorcerer 10.211.164.101 is alive
00:00:07.438 [Pipeline] retry
00:00:07.440 [Pipeline] {
00:00:07.452 [Pipeline] httpRequest
00:00:07.457 HttpMethod: GET
00:00:07.457 URL: http://10.211.164.101/packages/jbp_4f3f5a4a30726c4eea24a2c31f6bdf50c75a515d.tar.gz
00:00:07.458 Sending request to url: http://10.211.164.101/packages/jbp_4f3f5a4a30726c4eea24a2c31f6bdf50c75a515d.tar.gz
00:00:07.472 Response Code: HTTP/1.1 200 OK
00:00:07.473 Success: Status code 200 is in the accepted range: 200,404
00:00:07.473 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_4f3f5a4a30726c4eea24a2c31f6bdf50c75a515d.tar.gz
00:00:22.797 [Pipeline] }
00:00:22.811 [Pipeline] // retry
00:00:22.815 [Pipeline] sh
00:00:23.095 + tar --no-same-owner -xf jbp_4f3f5a4a30726c4eea24a2c31f6bdf50c75a515d.tar.gz
00:00:23.110 [Pipeline] httpRequest
00:00:23.527 [Pipeline] echo
00:00:23.529 Sorcerer 10.211.164.101 is alive
00:00:23.539 [Pipeline] retry
00:00:23.541 [Pipeline] {
00:00:23.555 [Pipeline] httpRequest
00:00:23.558 HttpMethod: GET
00:00:23.559 URL: http://10.211.164.101/packages/spdk_726a04d705a30cca40ac8dc8d45f839602005b7a.tar.gz
00:00:23.559 Sending request to url: http://10.211.164.101/packages/spdk_726a04d705a30cca40ac8dc8d45f839602005b7a.tar.gz
00:00:23.583 Response Code: HTTP/1.1 200 OK
00:00:23.584 Success: Status code 200 is in the accepted range: 200,404
00:00:23.584 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_726a04d705a30cca40ac8dc8d45f839602005b7a.tar.gz
00:02:23.560 [Pipeline] }
00:02:23.580 [Pipeline] // retry
00:02:23.587 [Pipeline] sh
00:02:23.875 + tar --no-same-owner -xf spdk_726a04d705a30cca40ac8dc8d45f839602005b7a.tar.gz
00:02:26.426 [Pipeline] sh
00:02:26.712 + git -C spdk log --oneline -n5
00:02:26.712 726a04d70 test/nvmf: adjust timeout for bigger nvmes
00:02:26.712 61c96acfb dpdk: Point dpdk submodule at a latest fix from spdk-23.11
00:02:26.712 7db6dcdb8 nvme/fio_plugin: update the way ruhs descriptors are fetched
00:02:26.712 ff6f5c41e nvme/fio_plugin: trim add support for multiple ranges
00:02:26.712 9469ea403 nvme/fio_plugin: add trim support
00:02:26.732 [Pipeline] withCredentials
00:02:26.744 > git --version # timeout=10
00:02:26.759 > git --version # 'git version 2.39.2'
00:02:26.777 Masking supported pattern matches of $GIT_PASSWORD or $GIT_ASKPASS
00:02:26.780 [Pipeline] {
00:02:26.789 [Pipeline] retry
00:02:26.791 [Pipeline] {
00:02:26.807 [Pipeline] sh
00:02:27.120 + git ls-remote http://dpdk.org/git/dpdk-stable v22.11.4
00:02:27.133 [Pipeline] }
00:02:27.193 [Pipeline] // retry
00:02:27.197 [Pipeline] }
00:02:27.209 [Pipeline] // withCredentials
00:02:27.215 [Pipeline] httpRequest
00:02:27.622 [Pipeline] echo
00:02:27.624 Sorcerer 10.211.164.101 is alive
00:02:27.633 [Pipeline] retry
00:02:27.635 [Pipeline] {
00:02:27.649 [Pipeline] httpRequest
00:02:27.654 HttpMethod: GET
00:02:27.654 URL: http://10.211.164.101/packages/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz
00:02:27.655 Sending request to url: http://10.211.164.101/packages/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz
00:02:27.669 Response Code: HTTP/1.1 200 OK
00:02:27.669 Success: Status code 200 is in the accepted range: 200,404
00:02:27.669 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz
00:02:42.230 [Pipeline] }
00:02:42.246 [Pipeline] // retry
00:02:42.253 [Pipeline] sh
00:02:42.538 + tar --no-same-owner -xf dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz
00:02:43.931 [Pipeline] sh
00:02:44.217 + git -C dpdk log --oneline -n5
00:02:44.217 caf0f5d395 version: 22.11.4
00:02:44.217 7d6f1cc05f Revert "net/iavf: fix abnormal disable HW interrupt"
00:02:44.217 dc9c799c7d vhost: fix missing spinlock unlock
00:02:44.217 4307659a90 net/mlx5: fix LACP redirection in Rx domain
00:02:44.217 6ef77f2a5e net/gve: fix RX buffer size alignment
00:02:44.228 [Pipeline] }
00:02:44.242 [Pipeline] // stage
00:02:44.251 [Pipeline] stage
00:02:44.253 [Pipeline] { (Prepare)
00:02:44.273 [Pipeline] writeFile
00:02:44.289 [Pipeline] sh
00:02:44.574 + logger -p user.info -t JENKINS-CI
00:02:44.587 [Pipeline] sh
00:02:44.872 + logger -p user.info -t JENKINS-CI
00:02:44.885 [Pipeline] sh
00:02:45.170 + cat autorun-spdk.conf
00:02:45.170 SPDK_RUN_FUNCTIONAL_TEST=1
00:02:45.170 SPDK_RUN_UBSAN=1
00:02:45.170 SPDK_TEST_FUZZER=1
00:02:45.170 SPDK_TEST_FUZZER_SHORT=1
00:02:45.170 SPDK_TEST_NATIVE_DPDK=v22.11.4
00:02:45.170 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:02:45.178 RUN_NIGHTLY=1
00:02:45.182 [Pipeline] readFile
00:02:45.207 [Pipeline] withEnv
00:02:45.209 [Pipeline] {
00:02:45.221 [Pipeline] sh
00:02:45.508 + set -ex
00:02:45.508 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]]
00:02:45.508 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:02:45.508 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:02:45.508 ++ SPDK_RUN_UBSAN=1
00:02:45.508 ++ SPDK_TEST_FUZZER=1
00:02:45.508 ++ SPDK_TEST_FUZZER_SHORT=1
00:02:45.508 ++ SPDK_TEST_NATIVE_DPDK=v22.11.4
00:02:45.508 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:02:45.508 ++ RUN_NIGHTLY=1
00:02:45.508 + case $SPDK_TEST_NVMF_NICS in
00:02:45.508 + DRIVERS=
00:02:45.508 + [[ -n '' ]]
00:02:45.508 + exit 0
00:02:45.517 [Pipeline] }
00:02:45.532 [Pipeline] // withEnv
00:02:45.537 [Pipeline] }
00:02:45.551 [Pipeline] // stage
00:02:45.561 [Pipeline] catchError
00:02:45.563 [Pipeline] {
00:02:45.577 [Pipeline] timeout
00:02:45.577 Timeout set to expire in 30 min
00:02:45.579 [Pipeline] {
00:02:45.594 [Pipeline] stage
00:02:45.596 [Pipeline] { (Tests)
00:02:45.610 [Pipeline] sh
00:02:45.896 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest
00:02:45.897 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest
00:02:45.897 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest
00:02:45.897 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]]
00:02:45.897 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:02:45.897 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output
00:02:45.897 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]]
00:02:45.897 + [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:02:45.897 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output
00:02:45.897 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:02:45.897 + [[ short-fuzz-phy-autotest == pkgdep-* ]]
00:02:45.897 + cd /var/jenkins/workspace/short-fuzz-phy-autotest
00:02:45.897 + source /etc/os-release
00:02:45.897 ++ NAME='Fedora Linux'
00:02:45.897 ++ VERSION='39 (Cloud Edition)'
00:02:45.897 ++ ID=fedora
00:02:45.897 ++ VERSION_ID=39
00:02:45.897 ++ VERSION_CODENAME=
00:02:45.897 ++ PLATFORM_ID=platform:f39
00:02:45.897 ++ PRETTY_NAME='Fedora Linux 39 (Cloud Edition)'
00:02:45.897 ++ ANSI_COLOR='0;38;2;60;110;180'
00:02:45.897 ++ LOGO=fedora-logo-icon
00:02:45.897 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:39
00:02:45.897 ++ HOME_URL=https://fedoraproject.org/
00:02:45.897 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/
00:02:45.897 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:02:45.897 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:02:45.897 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:02:45.897 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=39
00:02:45.897 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:02:45.897 ++ REDHAT_SUPPORT_PRODUCT_VERSION=39
00:02:45.897 ++ SUPPORT_END=2024-11-12
00:02:45.897 ++ VARIANT='Cloud Edition'
00:02:45.897 ++ VARIANT_ID=cloud
00:02:45.897 + uname -a
00:02:45.897 Linux spdk-wfp-20 6.8.9-200.fc39.x86_64 #1 SMP PREEMPT_DYNAMIC Wed Jul 24 03:04:40 UTC 2024 x86_64 GNU/Linux
00:02:45.897 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status
00:02:48.440 Hugepages
00:02:48.440 node hugesize free / total
00:02:48.701 node0 1048576kB 0 / 0
00:02:48.701 node0 2048kB 0 / 0
00:02:48.701 node1 1048576kB 0 / 0
00:02:48.701 node1 2048kB 0 / 0
00:02:48.701
00:02:48.701 Type BDF Vendor Device NUMA Driver Device Block devices
00:02:48.701 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:02:48.701 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:02:48.701 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:02:48.701 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:02:48.701 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:02:48.701 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:02:48.701 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:02:48.701 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:02:48.701 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:02:48.701 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:02:48.701 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:02:48.701 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:02:48.701 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:02:48.701 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:02:48.701 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:02:48.701 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:02:48.701 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1
00:02:48.701 + rm -f /tmp/spdk-ld-path
00:02:48.701 + source autorun-spdk.conf
00:02:48.701 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:02:48.701 ++ SPDK_RUN_UBSAN=1
00:02:48.701 ++ SPDK_TEST_FUZZER=1
00:02:48.701 ++ SPDK_TEST_FUZZER_SHORT=1
00:02:48.701 ++ SPDK_TEST_NATIVE_DPDK=v22.11.4
00:02:48.701 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:02:48.701 ++ RUN_NIGHTLY=1
00:02:48.701 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 ))
00:02:48.701 + [[ -n '' ]]
00:02:48.701 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:02:48.701 + for M in /var/spdk/build-*-manifest.txt
00:02:48.701 + [[ -f /var/spdk/build-kernel-manifest.txt ]]
00:02:48.701 + cp /var/spdk/build-kernel-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:02:48.701 + for M in /var/spdk/build-*-manifest.txt
00:02:48.701 + [[ -f /var/spdk/build-pkg-manifest.txt ]]
00:02:48.701 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:02:48.701 + for M in /var/spdk/build-*-manifest.txt
00:02:48.701 + [[ -f /var/spdk/build-repo-manifest.txt ]]
00:02:48.701 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:02:48.701 ++ uname
00:02:48.701 + [[ Linux == \L\i\n\u\x ]]
00:02:48.701 + sudo dmesg -T
00:02:48.961 + sudo dmesg --clear
00:02:48.961 + dmesg_pid=887538
00:02:48.961 + [[ Fedora Linux == FreeBSD ]]
00:02:48.961 + export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:02:48.961 + UNBIND_ENTIRE_IOMMU_GROUP=yes
00:02:48.961 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]]
00:02:48.961 + [[ -x /usr/src/fio-static/fio ]]
00:02:48.961 + export FIO_BIN=/usr/src/fio-static/fio
00:02:48.961 + FIO_BIN=/usr/src/fio-static/fio
00:02:48.961 + sudo dmesg -Tw
00:02:48.961 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]]
00:02:48.961 + [[ ! -v VFIO_QEMU_BIN ]]
00:02:48.961 + [[ -e /usr/local/qemu/vfio-user-latest ]]
00:02:48.961 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:02:48.961 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:02:48.961 + [[ -e /usr/local/qemu/vanilla-latest ]]
00:02:48.961 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:02:48.961 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:02:48.961 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:02:48.961 Test configuration:
00:02:48.961 SPDK_RUN_FUNCTIONAL_TEST=1
00:02:48.961 SPDK_RUN_UBSAN=1
00:02:48.961 SPDK_TEST_FUZZER=1
00:02:48.961 SPDK_TEST_FUZZER_SHORT=1
00:02:48.961 SPDK_TEST_NATIVE_DPDK=v22.11.4
00:02:48.961 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:02:48.961 RUN_NIGHTLY=1
08:20:41 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
08:20:41 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]]
08:20:41 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
08:20:41 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
08:20:41 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
08:20:41 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
08:20:41 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
08:20:41 -- paths/export.sh@5 -- $ export PATH
08:20:41 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
08:20:41 -- common/autobuild_common.sh@439 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output
08:20:41 -- common/autobuild_common.sh@440 -- $ date +%s
08:20:41 -- common/autobuild_common.sh@440 -- $ mktemp -dt spdk_1728022841.XXXXXX
08:20:41 -- common/autobuild_common.sh@440 -- $ SPDK_WORKSPACE=/tmp/spdk_1728022841.AfwrgU
08:20:41 -- common/autobuild_common.sh@442 -- $ [[ -n '' ]]
08:20:41 -- common/autobuild_common.sh@446 -- $ '[' -n v22.11.4 ']'
08:20:41 -- common/autobuild_common.sh@447 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
08:20:41 -- common/autobuild_common.sh@447 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk'
08:20:41 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp'
08:20:41 -- common/autobuild_common.sh@455 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
08:20:41 -- common/autobuild_common.sh@456 -- $ get_config_params
08:20:41 -- common/autotest_common.sh@387 -- $ xtrace_disable
08:20:41 -- common/autotest_common.sh@10 -- $ set +x
08:20:41 -- common/autobuild_common.sh@456 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user'
08:20:41 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD=
08:20:41 -- spdk/autobuild.sh@12 -- $ umask 022
08:20:41 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
08:20:41 -- spdk/autobuild.sh@16 -- $ date -u
00:02:48.962 Fri Oct 4 06:20:41 AM UTC 2024
08:20:41 -- spdk/autobuild.sh@17 -- $ git describe --tags
00:02:48.962 LTS-66-g726a04d70
08:20:41 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']'
08:20:41 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']'
08:20:41 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan'
08:20:41 -- common/autotest_common.sh@1077 -- $ '[' 3 -le 1 ']'
08:20:41 -- common/autotest_common.sh@1083 -- $ xtrace_disable
08:20:41 -- common/autotest_common.sh@10 -- $ set +x
00:02:48.962 ************************************
00:02:48.962 START TEST ubsan
00:02:48.962 ************************************
08:20:41 -- common/autotest_common.sh@1104 -- $ echo 'using ubsan'
00:02:48.962 using ubsan
00:02:48.962
00:02:48.962 real 0m0.000s
00:02:48.962 user 0m0.000s
00:02:48.962 sys 0m0.000s
08:20:41 -- common/autotest_common.sh@1105 -- $ xtrace_disable
08:20:41 -- common/autotest_common.sh@10 -- $ set +x
00:02:48.962 ************************************
00:02:48.962 END TEST ubsan
00:02:48.962 ************************************
08:20:41 -- spdk/autobuild.sh@27 -- $ '[' -n v22.11.4 ']'
08:20:41 -- spdk/autobuild.sh@28 -- $ build_native_dpdk
08:20:41 -- common/autobuild_common.sh@432 -- $ run_test build_native_dpdk _build_native_dpdk
08:20:41 -- common/autotest_common.sh@1077 -- $ '[' 2 -le 1 ']'
08:20:41 -- common/autotest_common.sh@1083 -- $ xtrace_disable
08:20:41 -- common/autotest_common.sh@10 -- $ set +x
00:02:48.962 ************************************
00:02:48.962 START TEST build_native_dpdk
00:02:48.962 ************************************
08:20:41 -- common/autotest_common.sh@1104 -- $ _build_native_dpdk
08:20:41 -- common/autobuild_common.sh@48 -- $ local external_dpdk_dir
08:20:41 -- common/autobuild_common.sh@49 -- $ local external_dpdk_base_dir
08:20:41 -- common/autobuild_common.sh@50 -- $ local compiler_version
08:20:41 -- common/autobuild_common.sh@51 -- $ local compiler
08:20:41 -- common/autobuild_common.sh@52 -- $ local dpdk_kmods
08:20:41 -- common/autobuild_common.sh@53 -- $ local repo=dpdk
08:20:41 -- common/autobuild_common.sh@55 -- $ compiler=gcc
08:20:41 -- common/autobuild_common.sh@61 -- $ export CC=gcc
08:20:41 -- common/autobuild_common.sh@61 -- $ CC=gcc
08:20:41 -- common/autobuild_common.sh@63 -- $ [[ gcc != *clang* ]]
08:20:41 -- common/autobuild_common.sh@63 -- $ [[ gcc != *gcc* ]]
08:20:41 -- common/autobuild_common.sh@68 -- $ gcc -dumpversion
08:20:41 -- common/autobuild_common.sh@68 -- $ compiler_version=13
08:20:41 -- common/autobuild_common.sh@69 -- $ compiler_version=13
08:20:41 -- common/autobuild_common.sh@70 -- $ external_dpdk_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
08:20:41 -- common/autobuild_common.sh@71 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
08:20:41 -- common/autobuild_common.sh@71 -- $ external_dpdk_base_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
08:20:41 -- common/autobuild_common.sh@73 -- $ [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk ]]
08:20:41 -- common/autobuild_common.sh@82 -- $ orgdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
08:20:41 -- common/autobuild_common.sh@83 -- $ git -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk log --oneline -n 5
00:02:48.962 caf0f5d395 version: 22.11.4
00:02:48.962 7d6f1cc05f Revert "net/iavf: fix abnormal disable HW interrupt"
00:02:48.962 dc9c799c7d vhost: fix missing spinlock unlock
00:02:48.962 4307659a90 net/mlx5: fix LACP redirection in Rx domain
00:02:48.962 6ef77f2a5e net/gve: fix RX buffer size alignment
08:20:41 -- common/autobuild_common.sh@85 -- $ dpdk_cflags='-fPIC -g -fcommon'
08:20:41 -- common/autobuild_common.sh@86 -- $ dpdk_ldflags=
08:20:41 -- common/autobuild_common.sh@87 -- $ dpdk_ver=22.11.4
08:20:41 -- common/autobuild_common.sh@89 -- $ [[ gcc == *gcc* ]]
08:20:41 -- common/autobuild_common.sh@89 -- $ [[ 13 -ge 5 ]]
08:20:41 -- common/autobuild_common.sh@90 -- $ dpdk_cflags+=' -Werror'
08:20:41 -- common/autobuild_common.sh@93 -- $ [[ gcc == *gcc* ]]
08:20:41 -- common/autobuild_common.sh@93 -- $ [[ 13 -ge 10 ]]
08:20:41 -- common/autobuild_common.sh@94 -- $ dpdk_cflags+=' -Wno-stringop-overflow'
08:20:41 -- common/autobuild_common.sh@100 -- $ DPDK_DRIVERS=("bus" "bus/pci" "bus/vdev" "mempool/ring" "net/i40e" "net/i40e/base")
08:20:41 -- common/autobuild_common.sh@102 -- $ local mlx5_libs_added=n
08:20:41 -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]]
08:20:41 -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]]
08:20:41 -- common/autobuild_common.sh@139 -- $ [[ 0 -eq 1 ]]
08:20:41 -- common/autobuild_common.sh@167 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
08:20:41 -- common/autobuild_common.sh@168 -- $ uname -s
08:20:41 -- common/autobuild_common.sh@168 -- $ '[' Linux = Linux ']'
08:20:41 -- common/autobuild_common.sh@169 -- $ lt 22.11.4 21.11.0
08:20:41 -- scripts/common.sh@372 -- $ cmp_versions 22.11.4 '<' 21.11.0
08:20:41 -- scripts/common.sh@332 -- $ local ver1 ver1_l
08:20:41 -- scripts/common.sh@333 -- $ local ver2 ver2_l
08:20:41 -- scripts/common.sh@335 -- $ IFS=.-:
08:20:41 -- scripts/common.sh@335 -- $ read -ra ver1
08:20:41 -- scripts/common.sh@336 -- $ IFS=.-:
08:20:41 -- scripts/common.sh@336 -- $ read -ra ver2
08:20:41 -- scripts/common.sh@337 -- $ local 'op=<'
08:20:41 -- scripts/common.sh@339 -- $ ver1_l=3
08:20:41 -- scripts/common.sh@340 -- $ ver2_l=3
08:20:41 -- scripts/common.sh@342 -- $ local lt=0 gt=0 eq=0 v
08:20:41 -- scripts/common.sh@343 -- $ case "$op" in
08:20:41 -- scripts/common.sh@344 -- $ : 1
08:20:41 -- scripts/common.sh@363 -- $ (( v = 0 ))
08:20:41 -- scripts/common.sh@363 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
08:20:41 -- scripts/common.sh@364 -- $ decimal 22
08:20:41 -- scripts/common.sh@352 -- $ local d=22
08:20:41 -- scripts/common.sh@353 -- $ [[ 22 =~ ^[0-9]+$ ]]
08:20:41 -- scripts/common.sh@354 -- $ echo 22
08:20:41 -- scripts/common.sh@364 -- $ ver1[v]=22
08:20:41 -- scripts/common.sh@365 -- $ decimal 21
08:20:41 -- scripts/common.sh@352 -- $ local d=21
08:20:41 -- scripts/common.sh@353 -- $ [[ 21 =~ ^[0-9]+$ ]]
08:20:41 -- scripts/common.sh@354 -- $ echo 21
08:20:41 -- scripts/common.sh@365 -- $ ver2[v]=21
08:20:41 -- scripts/common.sh@366 -- $ (( ver1[v] > ver2[v] ))
08:20:41 -- scripts/common.sh@366 -- $ return 1
08:20:41 -- common/autobuild_common.sh@173 -- $ patch -p1
00:02:49.222 patching file config/rte_config.h
00:02:49.222 Hunk #1 succeeded at 60 (offset 1 line).
08:20:41 -- common/autobuild_common.sh@176 -- $ lt 22.11.4 24.07.0
08:20:41 -- scripts/common.sh@372 -- $ cmp_versions 22.11.4 '<' 24.07.0
08:20:41 -- scripts/common.sh@332 -- $ local ver1 ver1_l
08:20:41 -- scripts/common.sh@333 -- $ local ver2 ver2_l
08:20:41 -- scripts/common.sh@335 -- $ IFS=.-:
08:20:41 -- scripts/common.sh@335 -- $ read -ra ver1
08:20:41 -- scripts/common.sh@336 -- $ IFS=.-:
08:20:41 -- scripts/common.sh@336 -- $ read -ra ver2
08:20:41 -- scripts/common.sh@337 -- $ local 'op=<'
08:20:41 -- scripts/common.sh@339 -- $ ver1_l=3
08:20:41 -- scripts/common.sh@340 -- $ ver2_l=3
08:20:41 -- scripts/common.sh@342 -- $ local lt=0 gt=0 eq=0 v
08:20:41 -- scripts/common.sh@343 -- $ case "$op" in
08:20:41 -- scripts/common.sh@344 -- $ : 1
08:20:41 -- scripts/common.sh@363 -- $ (( v = 0 ))
08:20:41 -- scripts/common.sh@363 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
08:20:41 -- scripts/common.sh@364 -- $ decimal 22
08:20:41 -- scripts/common.sh@352 -- $ local d=22
08:20:41 -- scripts/common.sh@353 -- $ [[ 22 =~ ^[0-9]+$ ]]
08:20:41 -- scripts/common.sh@354 -- $ echo 22
08:20:41 -- scripts/common.sh@364 -- $ ver1[v]=22
08:20:41 -- scripts/common.sh@365 -- $ decimal 24
08:20:41 -- scripts/common.sh@352 -- $ local d=24
08:20:41 -- scripts/common.sh@353 -- $ [[ 24 =~ ^[0-9]+$ ]]
08:20:41 -- scripts/common.sh@354 -- $ echo 24
08:20:41 -- scripts/common.sh@365 -- $ ver2[v]=24
08:20:41 -- scripts/common.sh@366 -- $ (( ver1[v] > ver2[v] ))
08:20:41 -- scripts/common.sh@367 -- $ (( ver1[v] < ver2[v] ))
08:20:41 -- scripts/common.sh@367 -- $ return 0
08:20:41 -- common/autobuild_common.sh@177 -- $ patch -p1
00:02:49.222 patching file lib/pcapng/rte_pcapng.c
00:02:49.222 Hunk #1 succeeded at 110 (offset -18 lines).
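The two cmp_versions traces above (22.11.4 against 21.11.0, then against 24.07.0, each deciding whether a patch applies) boil down to a field-wise integer compare. A compact re-sketch of that logic, not the exact scripts/common.sh source; lt_ver is a hypothetical name:

    # Split both versions on ".", "-" or ":" (the IFS=.-: reads in the trace)
    # and compare field by field; 10# forces base ten so a component such as
    # "07" is not parsed as octal (the role of the decimal helper above).
    lt_ver() {  # succeeds when $1 sorts strictly before $2
        local -a ver1 ver2
        local v
        IFS=.-: read -ra ver1 <<< "$1"
        IFS=.-: read -ra ver2 <<< "$2"
        for (( v = 0; v < ${#ver1[@]} || v < ${#ver2[@]}; v++ )); do
            (( 10#${ver1[v]:-0} < 10#${ver2[v]:-0} )) && return 0
            (( 10#${ver1[v]:-0} > 10#${ver2[v]:-0} )) && return 1
        done
        return 1   # equal versions are not "less than"
    }
    lt_ver 22.11.4 24.07.0 && echo 'older than 24.07, pcapng patch applies'
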
00:02:49.222 08:20:41 -- common/autobuild_common.sh@180 -- $ dpdk_kmods=false
08:20:41 -- common/autobuild_common.sh@181 -- $ uname -s
08:20:41 -- common/autobuild_common.sh@181 -- $ '[' Linux = FreeBSD ']'
08:20:41 -- common/autobuild_common.sh@185 -- $ printf %s, bus bus/pci bus/vdev mempool/ring net/i40e net/i40e/base
08:20:41 -- common/autobuild_common.sh@185 -- $ meson build-tmp --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --libdir lib -Denable_docs=false -Denable_kmods=false -Dtests=false -Dc_link_args= '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' -Dmachine=native -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,
00:02:54.505 The Meson build system
00:02:54.505 Version: 1.5.0
00:02:54.505 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
00:02:54.505 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp
00:02:54.505 Build type: native build
00:02:54.505 Program cat found: YES (/usr/bin/cat)
00:02:54.505 Project name: DPDK
00:02:54.505 Project version: 22.11.4
00:02:54.505 C compiler for the host machine: gcc (gcc 13.3.1 "gcc (GCC) 13.3.1 20240522 (Red Hat 13.3.1-1)")
00:02:54.505 C linker for the host machine: gcc ld.bfd 2.40-14
00:02:54.505 Host machine cpu family: x86_64
00:02:54.505 Host machine cpu: x86_64
00:02:54.505 Message: ## Building in Developer Mode ##
00:02:54.505 Program pkg-config found: YES (/usr/bin/pkg-config)
00:02:54.505 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/check-symbols.sh)
00:02:54.505 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/options-ibverbs-static.sh)
00:02:54.505 Program objdump found: YES (/usr/bin/objdump)
00:02:54.505 Program python3 found: YES (/usr/bin/python3)
00:02:54.505 Program cat found: YES (/usr/bin/cat)
00:02:54.505 config/meson.build:83: WARNING: The "machine" option is deprecated. Please use "cpu_instruction_set" instead.
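For reference, the configure call traced above, distilled to its moving parts (flags verbatim from this run; the paths are shortened to relative ones, which is the only assumption here). The explicit setup verb is what current Meson expects, and -Dmachine=native is the option config/meson.build warns about above:

    # Out-of-tree DPDK configure: build in build-tmp, install under ./build,
    # docs/kmods/tests off, only the bus, mempool/ring and i40e drivers on.
    meson setup build-tmp \
        --prefix="$PWD/build" --libdir lib \
        -Denable_docs=false -Denable_kmods=false -Dtests=false \
        -Dc_link_args= \
        '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' \
        -Dmachine=native \
        -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base
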
00:02:54.505 Checking for size of "void *" : 8
00:02:54.505 Checking for size of "void *" : 8 (cached)
00:02:54.505 Library m found: YES
00:02:54.505 Library numa found: YES
00:02:54.505 Has header "numaif.h" : YES
00:02:54.505 Library fdt found: NO
00:02:54.505 Library execinfo found: NO
00:02:54.505 Has header "execinfo.h" : YES
00:02:54.505 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5
00:02:54.505 Run-time dependency libarchive found: NO (tried pkgconfig)
00:02:54.505 Run-time dependency libbsd found: NO (tried pkgconfig)
00:02:54.505 Run-time dependency jansson found: NO (tried pkgconfig)
00:02:54.505 Run-time dependency openssl found: YES 3.1.1
00:02:54.505 Run-time dependency libpcap found: YES 1.10.4
00:02:54.505 Has header "pcap.h" with dependency libpcap: YES
00:02:54.505 Compiler for C supports arguments -Wcast-qual: YES
00:02:54.505 Compiler for C supports arguments -Wdeprecated: YES
00:02:54.505 Compiler for C supports arguments -Wformat: YES
00:02:54.505 Compiler for C supports arguments -Wformat-nonliteral: NO
00:02:54.505 Compiler for C supports arguments -Wformat-security: NO
00:02:54.505 Compiler for C supports arguments -Wmissing-declarations: YES
00:02:54.505 Compiler for C supports arguments -Wmissing-prototypes: YES
00:02:54.505 Compiler for C supports arguments -Wnested-externs: YES
00:02:54.505 Compiler for C supports arguments -Wold-style-definition: YES
00:02:54.505 Compiler for C supports arguments -Wpointer-arith: YES
00:02:54.505 Compiler for C supports arguments -Wsign-compare: YES
00:02:54.505 Compiler for C supports arguments -Wstrict-prototypes: YES
00:02:54.505 Compiler for C supports arguments -Wundef: YES
00:02:54.505 Compiler for C supports arguments -Wwrite-strings: YES
00:02:54.505 Compiler for C supports arguments -Wno-address-of-packed-member: YES
00:02:54.505 Compiler for C supports arguments -Wno-packed-not-aligned: YES
00:02:54.505 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:02:54.505 Compiler for C supports arguments -Wno-zero-length-bounds: YES
00:02:54.505 Compiler for C supports arguments -mavx512f: YES
00:02:54.505 Checking if "AVX512 checking" compiles: YES
00:02:54.505 Fetching value of define "__SSE4_2__" : 1
00:02:54.505 Fetching value of define "__AES__" : 1
00:02:54.505 Fetching value of define "__AVX__" : 1
00:02:54.505 Fetching value of define "__AVX2__" : 1
00:02:54.505 Fetching value of define "__AVX512BW__" : 1
00:02:54.505 Fetching value of define "__AVX512CD__" : 1
00:02:54.505 Fetching value of define "__AVX512DQ__" : 1
00:02:54.505 Fetching value of define "__AVX512F__" : 1
00:02:54.505 Fetching value of define "__AVX512VL__" : 1
00:02:54.506 Fetching value of define "__PCLMUL__" : 1
00:02:54.506 Fetching value of define "__RDRND__" : 1
00:02:54.506 Fetching value of define "__RDSEED__" : 1
00:02:54.506 Fetching value of define "__VPCLMULQDQ__" : (undefined)
00:02:54.506 Compiler for C supports arguments -Wno-format-truncation: YES
00:02:54.506 Message: lib/kvargs: Defining dependency "kvargs"
00:02:54.506 Message: lib/telemetry: Defining dependency "telemetry"
00:02:54.506 Checking for function "getentropy" : YES
00:02:54.506 Message: lib/eal: Defining dependency "eal"
00:02:54.506 Message: lib/ring: Defining dependency "ring"
00:02:54.506 Message: lib/rcu: Defining dependency "rcu"
00:02:54.506 Message: lib/mempool: Defining dependency "mempool"
00:02:54.506 Message: lib/mbuf: Defining dependency "mbuf"
00:02:54.506 Fetching value of define "__PCLMUL__" : 1 (cached)
00:02:54.506 Fetching value of define "__AVX512F__" : 1 (cached)
00:02:54.506 Fetching value of define "__AVX512BW__" : 1 (cached)
00:02:54.506 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:02:54.506 Fetching value of define "__AVX512VL__" : 1 (cached)
00:02:54.506 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached)
00:02:54.506 Compiler for C supports arguments -mpclmul: YES
00:02:54.506 Compiler for C supports arguments -maes: YES
00:02:54.506 Compiler for C supports arguments -mavx512f: YES (cached)
00:02:54.506 Compiler for C supports arguments -mavx512bw: YES
00:02:54.506 Compiler for C supports arguments -mavx512dq: YES
00:02:54.506 Compiler for C supports arguments -mavx512vl: YES
00:02:54.506 Compiler for C supports arguments -mvpclmulqdq: YES
00:02:54.506 Compiler for C supports arguments -mavx2: YES
00:02:54.506 Compiler for C supports arguments -mavx: YES
00:02:54.506 Message: lib/net: Defining dependency "net"
00:02:54.506 Message: lib/meter: Defining dependency "meter"
00:02:54.506 Message: lib/ethdev: Defining dependency "ethdev"
00:02:54.506 Message: lib/pci: Defining dependency "pci"
00:02:54.506 Message: lib/cmdline: Defining dependency "cmdline"
00:02:54.506 Message: lib/metrics: Defining dependency "metrics"
00:02:54.506 Message: lib/hash: Defining dependency "hash"
00:02:54.506 Message: lib/timer: Defining dependency "timer"
00:02:54.506 Fetching value of define "__AVX2__" : 1 (cached)
00:02:54.506 Fetching value of define "__AVX512F__" : 1 (cached)
00:02:54.506 Fetching value of define "__AVX512VL__" : 1 (cached)
00:02:54.506 Fetching value of define "__AVX512CD__" : 1 (cached)
00:02:54.506 Fetching value of define "__AVX512BW__" : 1 (cached)
00:02:54.506 Message: lib/acl: Defining dependency "acl"
00:02:54.506 Message: lib/bbdev: Defining dependency "bbdev"
00:02:54.506 Message: lib/bitratestats: Defining dependency "bitratestats"
00:02:54.506 Run-time dependency libelf found: YES 0.191
00:02:54.506 Message: lib/bpf: Defining dependency "bpf"
00:02:54.506 Message: lib/cfgfile: Defining dependency "cfgfile"
00:02:54.506 Message: lib/compressdev: Defining dependency "compressdev"
00:02:54.506 Message: lib/cryptodev: Defining dependency "cryptodev"
00:02:54.506 Message: lib/distributor: Defining dependency "distributor"
00:02:54.506 Message: lib/efd: Defining dependency "efd"
00:02:54.506 Message: lib/eventdev: Defining dependency "eventdev"
00:02:54.506 Message: lib/gpudev: Defining dependency "gpudev"
00:02:54.506 Message: lib/gro: Defining dependency "gro"
00:02:54.506 Message: lib/gso: Defining dependency "gso"
00:02:54.506 Message: lib/ip_frag: Defining dependency "ip_frag"
00:02:54.506 Message: lib/jobstats: Defining dependency "jobstats"
00:02:54.506 Message: lib/latencystats: Defining dependency "latencystats"
00:02:54.506 Message: lib/lpm: Defining dependency "lpm"
00:02:54.506 Fetching value of define "__AVX512F__" : 1 (cached)
00:02:54.506 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:02:54.506 Fetching value of define "__AVX512IFMA__" : (undefined)
00:02:54.506 Compiler for C supports arguments -mavx512f -mavx512dq -mavx512ifma: YES
00:02:54.506 Message: lib/member: Defining dependency "member"
00:02:54.506 Message: lib/pcapng: Defining dependency "pcapng"
00:02:54.506 Compiler for C supports arguments -Wno-cast-qual: YES
00:02:54.506 Message: lib/power: Defining dependency "power"
00:02:54.506 Message: lib/rawdev: Defining dependency "rawdev"
00:02:54.506 Message: lib/regexdev: Defining dependency "regexdev"
00:02:54.506 Message: lib/dmadev: Defining dependency "dmadev"
00:02:54.506 Message: lib/rib: Defining dependency "rib"
00:02:54.506 Message: lib/reorder: Defining dependency "reorder"
00:02:54.506 Message: lib/sched: Defining dependency "sched"
00:02:54.506 Message: lib/security: Defining dependency "security"
00:02:54.506 Message: lib/stack: Defining dependency "stack"
00:02:54.506 Has header "linux/userfaultfd.h" : YES
00:02:54.506 Message: lib/vhost: Defining dependency "vhost"
00:02:54.506 Message: lib/ipsec: Defining dependency "ipsec"
00:02:54.506 Fetching value of define "__AVX512F__" : 1 (cached)
00:02:54.506 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:02:54.506 Fetching value of define "__AVX512BW__" : 1 (cached)
00:02:54.506 Message: lib/fib: Defining dependency "fib"
00:02:54.506 Message: lib/port: Defining dependency "port"
00:02:54.506 Message: lib/pdump: Defining dependency "pdump"
00:02:54.506 Message: lib/table: Defining dependency "table"
00:02:54.506 Message: lib/pipeline: Defining dependency "pipeline"
00:02:54.506 Message: lib/graph: Defining dependency "graph"
00:02:54.506 Message: lib/node: Defining dependency "node"
00:02:54.506 Compiler for C supports arguments -Wno-format-truncation: YES (cached)
00:02:54.506 Message: drivers/bus/pci: Defining dependency "bus_pci"
00:02:54.506 Message: drivers/bus/vdev: Defining dependency "bus_vdev"
00:02:54.506 Message: drivers/mempool/ring: Defining dependency "mempool_ring"
00:02:54.506 Compiler for C supports arguments -Wno-sign-compare: YES
00:02:54.506 Compiler for C supports arguments -Wno-unused-value: YES
00:02:54.506 Compiler for C supports arguments -Wno-format: YES
00:02:54.506 Compiler for C supports arguments -Wno-format-security: YES
00:02:54.506 Compiler for C supports arguments -Wno-format-nonliteral: YES
00:02:55.081 Compiler for C supports arguments -Wno-strict-aliasing: YES
00:02:55.081 Compiler for C supports arguments -Wno-unused-but-set-variable: YES
00:02:55.081 Compiler for C supports arguments -Wno-unused-parameter: YES
00:02:55.081 Fetching value of define "__AVX2__" : 1 (cached)
00:02:55.081 Fetching value of define "__AVX512F__" : 1 (cached)
00:02:55.081 Fetching value of define "__AVX512BW__" : 1 (cached)
00:02:55.081 Compiler for C supports arguments -mavx512f: YES (cached)
00:02:55.081 Compiler for C supports arguments -mavx512bw: YES (cached)
00:02:55.081 Compiler for C supports arguments -march=skylake-avx512: YES
00:02:55.081 Message: drivers/net/i40e: Defining dependency "net_i40e"
00:02:55.081 Program doxygen found: YES (/usr/local/bin/doxygen)
00:02:55.081 Configuring doxy-api.conf using configuration
00:02:55.081 Program sphinx-build found: NO
00:02:55.081 Configuring rte_build_config.h using configuration
00:02:55.081 Message:
00:02:55.081 =================
00:02:55.081 Applications Enabled
00:02:55.081 =================
00:02:55.081
00:02:55.081 apps:
00:02:55.081 dumpcap, pdump, proc-info, test-acl, test-bbdev, test-cmdline, test-compress-perf, test-crypto-perf,
00:02:55.081 test-eventdev, test-fib, test-flow-perf, test-gpudev, test-pipeline, test-pmd, test-regex, test-sad,
00:02:55.081 test-security-perf,
00:02:55.081
00:02:55.081 Message:
00:02:55.081 =================
00:02:55.081 Libraries Enabled
00:02:55.081 =================
00:02:55.081
00:02:55.081 libs:
00:02:55.081 kvargs, telemetry, eal, ring, rcu, mempool, mbuf, net,
00:02:55.081 meter, ethdev, pci, cmdline, metrics, hash, timer, acl,
00:02:55.081 bbdev, bitratestats, bpf, cfgfile, compressdev, cryptodev, distributor, efd,
00:02:55.081 eventdev, gpudev, gro, gso, ip_frag, jobstats, latencystats, lpm,
00:02:55.081 member, pcapng, power, rawdev, regexdev, dmadev, rib, reorder,
00:02:55.081 sched, security, stack, vhost, ipsec, fib, port, pdump,
00:02:55.081 table, pipeline, graph, node,
00:02:55.081
00:02:55.081 Message:
00:02:55.081 ===============
00:02:55.081 Drivers Enabled
00:02:55.081 ===============
00:02:55.081
00:02:55.081 common:
00:02:55.081
00:02:55.081 bus:
00:02:55.081 pci, vdev,
00:02:55.081 mempool:
00:02:55.081 ring,
00:02:55.081 dma:
00:02:55.081
00:02:55.081 net:
00:02:55.081 i40e,
00:02:55.081 raw:
00:02:55.081
00:02:55.081 crypto:
00:02:55.081
00:02:55.081 compress:
00:02:55.081
00:02:55.081 regex:
00:02:55.081
00:02:55.081 vdpa:
00:02:55.081
00:02:55.081 event:
00:02:55.081
00:02:55.081 baseband:
00:02:55.081
00:02:55.081 gpu:
00:02:55.081
00:02:55.081
00:02:55.081 Message:
00:02:55.081 =================
00:02:55.081 Content Skipped
00:02:55.081 =================
00:02:55.081
00:02:55.081 apps:
00:02:55.081
00:02:55.081 libs:
00:02:55.081 kni: explicitly disabled via build config (deprecated lib)
00:02:55.081 flow_classify: explicitly disabled via build config (deprecated lib)
00:02:55.081
00:02:55.081 drivers:
00:02:55.081 common/cpt: not in enabled drivers build config
00:02:55.081 common/dpaax: not in enabled drivers build config
00:02:55.081 common/iavf: not in enabled drivers build config
00:02:55.081 common/idpf: not in enabled drivers build config
00:02:55.081 common/mvep: not in enabled drivers build config
00:02:55.081 common/octeontx: not in enabled drivers build config
00:02:55.081 bus/auxiliary: not in enabled drivers build config
00:02:55.081 bus/dpaa: not in enabled drivers build config
00:02:55.081 bus/fslmc: not in enabled drivers build config
00:02:55.081 bus/ifpga: not in enabled drivers build config
00:02:55.081 bus/vmbus: not in enabled drivers build config
00:02:55.081 common/cnxk: not in enabled drivers build config
00:02:55.081 common/mlx5: not in enabled drivers build config
00:02:55.081 common/qat: not in enabled drivers build config
00:02:55.081 common/sfc_efx: not in enabled drivers build config
00:02:55.082 mempool/bucket: not in enabled drivers build config
00:02:55.082 mempool/cnxk: not in enabled drivers build config
00:02:55.082 mempool/dpaa: not in enabled drivers build config
00:02:55.082 mempool/dpaa2: not in enabled drivers build config
00:02:55.082 mempool/octeontx: not in enabled drivers build config
00:02:55.082 mempool/stack: not in enabled drivers build config
00:02:55.082 dma/cnxk: not in enabled drivers build config
00:02:55.082 dma/dpaa: not in enabled drivers build config
00:02:55.082 dma/dpaa2: not in enabled drivers build config
00:02:55.082 dma/hisilicon: not in enabled drivers build config
00:02:55.082 dma/idxd: not in enabled drivers build config
00:02:55.082 dma/ioat: not in enabled drivers build config
00:02:55.082 dma/skeleton: not in enabled drivers build config
00:02:55.082 net/af_packet: not in enabled drivers build config
00:02:55.082 net/af_xdp: not in enabled drivers build config
00:02:55.082 net/ark: not in enabled drivers build config
00:02:55.082 net/atlantic: not in enabled drivers build config
00:02:55.082 net/avp: not in enabled drivers build config
00:02:55.082 net/axgbe: not in enabled drivers build config
00:02:55.082 net/bnx2x: not in enabled drivers build config
00:02:55.082 net/bnxt: not in enabled drivers build config
00:02:55.082 net/bonding: not in enabled drivers build config
00:02:55.082 net/cnxk: not in enabled drivers build config
00:02:55.082 net/cxgbe: not in enabled drivers build config
00:02:55.082 net/dpaa: not in enabled drivers build config
00:02:55.082 net/dpaa2: not in enabled drivers build config
00:02:55.082 net/e1000: not in enabled drivers build config
00:02:55.082 net/ena: not in enabled drivers build config
00:02:55.082 net/enetc: not in enabled drivers build config
00:02:55.082 net/enetfec: not in enabled drivers build config
00:02:55.082 net/enic: not in enabled drivers build config
00:02:55.082 net/failsafe: not in enabled drivers build config
00:02:55.082 net/fm10k: not in enabled drivers build config
00:02:55.082 net/gve: not in enabled drivers build config
00:02:55.082 net/hinic: not in enabled drivers build config
00:02:55.082 net/hns3: not in enabled drivers build config
00:02:55.082 net/iavf: not in enabled drivers build config
00:02:55.082 net/ice: not in enabled drivers build config
00:02:55.082 net/idpf: not in enabled drivers build config
00:02:55.082 net/igc: not in enabled drivers build config
00:02:55.082 net/ionic: not in enabled drivers build config
00:02:55.082 net/ipn3ke: not in enabled drivers build config
00:02:55.082 net/ixgbe: not in enabled drivers build config
00:02:55.082 net/kni: not in enabled drivers build config
00:02:55.082 net/liquidio: not in enabled drivers build config
00:02:55.082 net/mana: not in enabled drivers build config
00:02:55.082 net/memif: not in enabled drivers build config
00:02:55.082 net/mlx4: not in enabled drivers build config
00:02:55.082 net/mlx5: not in enabled drivers build config
00:02:55.082 net/mvneta: not in enabled drivers build config
00:02:55.082 net/mvpp2: not in enabled drivers build config
00:02:55.082 net/netvsc: not in enabled drivers build config
00:02:55.082 net/nfb: not in enabled drivers build config
00:02:55.082 net/nfp: not in enabled drivers build config
00:02:55.082 net/ngbe: not in enabled drivers build config
00:02:55.082 net/null: not in enabled drivers build config
00:02:55.082 net/octeontx: not in enabled drivers build config
00:02:55.082 net/octeon_ep: not in enabled drivers build config
00:02:55.082 net/pcap: not in enabled drivers build config
00:02:55.082 net/pfe: not in enabled drivers build config
00:02:55.082 net/qede: not in enabled drivers build config
00:02:55.082 net/ring: not in enabled drivers build config
00:02:55.082 net/sfc: not in enabled drivers build config
00:02:55.082 net/softnic: not in enabled drivers build config
00:02:55.082 net/tap: not in enabled drivers build config
00:02:55.082 net/thunderx: not in enabled drivers build config
00:02:55.082 net/txgbe: not in enabled drivers build config
00:02:55.082 net/vdev_netvsc: not in enabled drivers build config
00:02:55.082 net/vhost: not in enabled drivers build config
00:02:55.082 net/virtio: not in enabled drivers build config
00:02:55.082 net/vmxnet3: not in enabled drivers build config
00:02:55.082 raw/cnxk_bphy: not in enabled drivers build config
00:02:55.082 raw/cnxk_gpio: not in enabled drivers build config
00:02:55.082 raw/dpaa2_cmdif: not in enabled drivers build config
00:02:55.082 raw/ifpga: not in enabled drivers build config
00:02:55.082 raw/ntb: not in enabled drivers build config
00:02:55.082 raw/skeleton: not in enabled drivers build config
00:02:55.082 crypto/armv8: not in enabled drivers build config
00:02:55.082 crypto/bcmfs: not in enabled drivers build config
00:02:55.082 crypto/caam_jr: not in enabled drivers build config
00:02:55.082 crypto/ccp: not in enabled drivers build config
00:02:55.082 crypto/cnxk: not in enabled drivers build config
00:02:55.082 crypto/dpaa_sec: not in enabled drivers build config
00:02:55.082 crypto/dpaa2_sec: not in enabled drivers build config
00:02:55.082 crypto/ipsec_mb: not in enabled drivers build config
00:02:55.082 crypto/mlx5: not in enabled drivers build config
00:02:55.082 crypto/mvsam: not in enabled drivers build config
00:02:55.082 crypto/nitrox: not in enabled drivers build config
00:02:55.082 crypto/null: not in enabled drivers build config
00:02:55.082 crypto/octeontx: not in enabled drivers build config
00:02:55.082 crypto/openssl: not in enabled drivers build config
00:02:55.082 crypto/scheduler: not in enabled drivers build config
00:02:55.082 crypto/uadk: not in enabled drivers build config
00:02:55.082 crypto/virtio: not in enabled drivers build config
00:02:55.082 compress/isal: not in enabled drivers build config
00:02:55.082 compress/mlx5: not in enabled drivers build config
00:02:55.082 compress/octeontx: not in enabled drivers build config
00:02:55.082 compress/zlib: not in enabled drivers build config
00:02:55.082 regex/mlx5: not in enabled drivers build config
00:02:55.082 regex/cn9k: not in enabled drivers build config
00:02:55.082 vdpa/ifc: not in enabled drivers build config
00:02:55.082 vdpa/mlx5: not in enabled drivers build config
00:02:55.082 vdpa/sfc: not in enabled drivers build config
00:02:55.082 event/cnxk: not in enabled drivers build config
00:02:55.082 event/dlb2: not in enabled drivers build config
00:02:55.082 event/dpaa: not in enabled drivers build config
00:02:55.082 event/dpaa2: not in enabled drivers build config
00:02:55.082 event/dsw: not in enabled drivers build config
00:02:55.082 event/opdl: not in enabled drivers build config
00:02:55.082 event/skeleton: not in enabled drivers build config
00:02:55.082 event/sw: not in enabled drivers build config
00:02:55.082 event/octeontx: not in enabled drivers build config
00:02:55.082 baseband/acc: not in enabled drivers build config
00:02:55.082 baseband/fpga_5gnr_fec: not in enabled drivers build config
00:02:55.082 baseband/fpga_lte_fec: not in enabled drivers build config
00:02:55.082 baseband/la12xx: not in enabled drivers build config
00:02:55.082 baseband/null: not in enabled drivers build config
00:02:55.082 baseband/turbo_sw: not in enabled drivers build config
00:02:55.082 gpu/cuda: not in enabled drivers build config
00:02:55.082
00:02:55.082
00:02:55.082 Build targets in project: 311
00:02:55.082
00:02:55.082 DPDK 22.11.4
00:02:55.082
00:02:55.082 User defined options
00:02:55.082 libdir : lib
00:02:55.082 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:02:55.082 c_args : -fPIC -g -fcommon -Werror -Wno-stringop-overflow
00:02:55.082 c_link_args :
00:02:55.082 enable_docs : false
00:02:55.082 enable_drivers: bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,
00:02:55.082 enable_kmods : false
00:02:55.082 machine : native
00:02:55.082 tests : false
00:02:55.082
00:02:55.082 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:02:55.082 WARNING: Running the setup command as `meson [options]` instead of `meson setup [options]` is ambiguous and deprecated.
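The build step that follows is plain ninja pointed at the Meson build directory. A minimal equivalent, assuming only the paths used in this job (-j112 matches this runner's CPU count; omit -j to let ninja pick a default):

    # Compile everything Meson configured in dpdk/build-tmp.
    ninja -C dpdk/build-tmp -j112
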
00:02:55.083 08:20:47 -- common/autobuild_common.sh@189 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112
00:02:55.083 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp'
00:02:55.083 [1/740] Generating lib/rte_kvargs_def with a custom command
00:02:55.083 [2/740] Generating lib/rte_kvargs_mingw with a custom command
00:02:55.083 [3/740] Generating lib/rte_telemetry_def with a custom command
00:02:55.083 [4/740] Generating lib/rte_telemetry_mingw with a custom command
00:02:55.083 [5/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o
00:02:55.083 [6/740] Generating lib/rte_rcu_def with a custom command
00:02:55.083 [7/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o
00:02:55.083 [8/740] Generating lib/rte_ring_def with a custom command
00:02:55.083 [9/740] Generating lib/rte_ring_mingw with a custom command
00:02:55.083 [10/740] Generating lib/rte_eal_def with a custom command
00:02:55.083 [11/740] Generating lib/rte_eal_mingw with a custom command
00:02:55.083 [12/740] Generating lib/rte_rcu_mingw with a custom command
00:02:55.083 [13/740] Generating lib/rte_mempool_def with a custom command
00:02:55.083 [14/740] Generating lib/rte_mempool_mingw with a custom command
00:02:55.083 [15/740] Generating lib/rte_mbuf_mingw with a custom command
00:02:55.083 [16/740] Generating lib/rte_net_def with a custom command
00:02:55.083 [17/740] Generating lib/rte_mbuf_def with a custom command
00:02:55.083 [18/740] Generating lib/rte_net_mingw with a custom command
00:02:55.083 [19/740] Generating lib/rte_meter_def with a custom command
00:02:55.083 [20/740] Generating lib/rte_meter_mingw with a custom command
00:02:55.342 [21/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o
00:02:55.342 [22/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o
00:02:55.342 [23/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o
00:02:55.342 [24/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o
00:02:55.342 [25/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o
00:02:55.342 [26/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_log.c.o
00:02:55.342 [27/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o
00:02:55.342 [28/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o
00:02:55.342 [29/740] Generating lib/rte_ethdev_mingw with a custom command
00:02:55.342 [30/740] Generating lib/rte_pci_mingw with a custom command
00:02:55.342 [31/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o
00:02:55.342 [32/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o
00:02:55.342 [33/740] Generating lib/rte_pci_def with a custom command
00:02:55.342 [34/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o
00:02:55.342 [35/740] Generating lib/rte_ethdev_def with a custom command
00:02:55.342 [36/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o
00:02:55.342 [37/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o
00:02:55.342 [38/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o
00:02:55.342 [39/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o
00:02:55.342 [40/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o
00:02:55.342 [41/740] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o
00:02:55.343 [42/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o
00:02:55.343 [43/740] Generating lib/rte_metrics_mingw with a custom command
00:02:55.343 [44/740] Generating lib/rte_metrics_def with a custom command
00:02:55.343 [45/740] Generating lib/rte_cmdline_mingw with a custom command
00:02:55.343 [46/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o
00:02:55.343 [47/740] Generating lib/rte_cmdline_def with a custom command
00:02:55.343 [48/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o
00:02:55.343 [49/740] Linking static target lib/librte_kvargs.a
00:02:55.343 [50/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o
00:02:55.343 [51/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o
00:02:55.343 [52/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o
00:02:55.343 [53/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o
00:02:55.343 [54/740] Generating lib/rte_hash_def with a custom command
00:02:55.343 [55/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o
00:02:55.343 [56/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o
00:02:55.343 [57/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o
00:02:55.343 [58/740] Generating lib/rte_timer_def with a custom command
00:02:55.343 [59/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o
00:02:55.343 [60/740] Generating lib/rte_hash_mingw with a custom command
00:02:55.343 [61/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o
00:02:55.343 [62/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o
00:02:55.343 [63/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o
00:02:55.343 [64/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o
00:02:55.343 [65/740] Generating lib/rte_timer_mingw with a custom command
00:02:55.343 [66/740] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o
00:02:55.343 [67/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o
00:02:55.343 [68/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o
00:02:55.343 [69/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o
00:02:55.343 [70/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o
00:02:55.343 [71/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o
00:02:55.343 [72/740] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o
00:02:55.343 [73/740] Generating lib/rte_acl_def with a custom command
00:02:55.343 [74/740] Generating lib/rte_acl_mingw with a custom command
00:02:55.343 [75/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o
00:02:55.343 [76/740] Generating lib/rte_bbdev_mingw with a custom command
00:02:55.343 [77/740] Generating lib/rte_bitratestats_def with a custom command
00:02:55.343 [78/740] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o
00:02:55.343 [79/740] Linking static target lib/librte_pci.a
00:02:55.343 [80/740] Generating lib/rte_bbdev_def with a custom command
00:02:55.343 [81/740] Generating lib/rte_bitratestats_mingw with a custom command
00:02:55.343 [82/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o
00:02:55.343 [83/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o
00:02:55.343 [84/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o
00:02:55.343 [85/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o
00:02:55.343 [86/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o
00:02:55.343 [87/740] Linking static target lib/librte_meter.a
00:02:55.343 [88/740] Generating lib/rte_bpf_def with a custom command
00:02:55.343 [89/740] Generating lib/rte_bpf_mingw with a custom command
00:02:55.343 [90/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o
00:02:55.608 [91/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o
00:02:55.608 [92/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o
00:02:55.608 [93/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o
00:02:55.608 [94/740] Generating lib/rte_cfgfile_mingw with a custom command
00:02:55.608 [95/740] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o
00:02:55.608 [96/740] Generating lib/rte_cfgfile_def with a custom command
00:02:55.608 [97/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o
00:02:55.608 [98/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o
00:02:55.608 [99/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o
00:02:55.608 [100/740] Generating lib/rte_compressdev_mingw with a custom command
00:02:55.608 [101/740] Generating lib/rte_compressdev_def with a custom command
00:02:55.608 [102/740] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o
00:02:55.608 [103/740] Linking static target lib/librte_ring.a
00:02:55.608 [104/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o
00:02:55.608 [105/740] Generating lib/rte_cryptodev_def with a custom command
00:02:55.608 [106/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o
00:02:55.608 [107/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o
00:02:55.608 [108/740] Generating lib/rte_cryptodev_mingw with a custom command
00:02:55.608 [109/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o
00:02:55.608 [110/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_log.c.o
00:02:55.608 [111/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o
00:02:55.608 [112/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o
00:02:55.608 [113/740] Generating lib/rte_efd_def with a custom command
00:02:55.608 [114/740] Generating lib/rte_distributor_def with a custom command
00:02:55.608 [115/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o
00:02:55.608 [116/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o
00:02:55.608 [117/740] Generating lib/rte_efd_mingw with a custom command
00:02:55.608 [118/740] Generating lib/rte_distributor_mingw with a custom command
00:02:55.608 [119/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o
00:02:55.608 [120/740] Generating lib/rte_eventdev_mingw with a custom command
00:02:55.608 [121/740] Generating lib/rte_gpudev_def with a custom command
00:02:55.608 [122/740] Generating lib/rte_eventdev_def with a custom command
00:02:55.608 [123/740] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o
00:02:55.608 [124/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o
00:02:55.608 [125/740] Generating lib/rte_gpudev_mingw with a custom command
00:02:55.608 [126/740] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics.c.o
00:02:55.608 [127/740] Generating lib/rte_gro_def with a custom command
00:02:55.608 [128/740] Generating lib/rte_gro_mingw with a custom command
00:02:55.608 [129/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o
00:02:55.608 [130/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o
00:02:55.608 [131/740] Generating lib/rte_gso_def with a custom command
00:02:55.608 [132/740] Generating lib/rte_gso_mingw with a custom command
00:02:55.608 [133/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o
00:02:55.608 [134/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o
00:02:55.868 [135/740] Generating lib/rte_ip_frag_mingw with a custom command
00:02:55.868 [136/740] Generating lib/rte_ip_frag_def with a custom command
00:02:55.868 [137/740] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output)
00:02:55.868 [138/740] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output)
00:02:55.868 [139/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o
00:02:55.868 [140/740] Generating lib/rte_jobstats_def with a custom command
00:02:55.868 [141/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o
00:02:55.868 [142/740] Linking target lib/librte_kvargs.so.23.0
00:02:55.868 [143/740] Generating lib/rte_jobstats_mingw with a custom command
00:02:55.868 [144/740] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output)
00:02:55.868 [145/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o
00:02:55.868 [146/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o
00:02:55.868 [147/740] Compiling C object lib/librte_cfgfile.a.p/cfgfile_rte_cfgfile.c.o
00:02:55.868 [148/740] Generating lib/rte_latencystats_def with a custom command
00:02:55.868 [149/740] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o
00:02:55.868 [150/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o
00:02:55.868 [151/740] Generating lib/rte_latencystats_mingw with a custom command
00:02:55.868 [152/740] Linking static target lib/librte_cfgfile.a
00:02:55.868 [153/740] Generating lib/rte_lpm_def with a custom command
00:02:55.868 [154/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o
00:02:55.868 [155/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o
00:02:55.868 [156/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o
00:02:55.868 [157/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o
00:02:55.868 [158/740] Generating lib/rte_lpm_mingw with a custom command
00:02:55.868 [159/740] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o
00:02:55.868 [160/740] Generating lib/rte_member_def with a custom command
00:02:55.868 [161/740] Generating lib/rte_member_mingw with a custom command
00:02:55.868 [162/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o
00:02:55.868 [163/740] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o
00:02:55.868 [164/740] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o
00:02:55.868 [165/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o
00:02:55.868 [166/740] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output)
00:02:55.868 [167/740] Generating lib/rte_pcapng_def with a custom command
00:02:55.869 [168/740] Generating lib/rte_pcapng_mingw with a custom command 00:02:55.869 [169/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:02:55.869 [170/740] Compiling C object lib/librte_acl.a.p/acl_tb_mem.c.o 00:02:55.869 [171/740] Compiling C object lib/librte_jobstats.a.p/jobstats_rte_jobstats.c.o 00:02:55.869 [172/740] Linking static target lib/librte_jobstats.a 00:02:55.869 [173/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:02:55.869 [174/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:02:55.869 [175/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:02:55.869 [176/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:02:56.135 [177/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:02:56.135 [178/740] Linking static target lib/librte_cmdline.a 00:02:56.135 [179/740] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics_telemetry.c.o 00:02:56.135 [180/740] Generating lib/rte_power_def with a custom command 00:02:56.135 [181/740] Generating lib/rte_power_mingw with a custom command 00:02:56.135 [182/740] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:02:56.135 [183/740] Linking static target lib/librte_metrics.a 00:02:56.135 [184/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:02:56.135 [185/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:02:56.135 [186/740] Linking static target lib/librte_timer.a 00:02:56.135 [187/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:02:56.135 [188/740] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:02:56.135 [189/740] Generating lib/rte_rawdev_def with a custom command 00:02:56.135 [190/740] Generating lib/rte_rawdev_mingw with a custom command 00:02:56.135 [191/740] Generating symbol file lib/librte_kvargs.so.23.0.p/librte_kvargs.so.23.0.symbols 00:02:56.135 [192/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_stub.c.o 00:02:56.135 [193/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:02:56.135 [194/740] Linking static target lib/librte_telemetry.a 00:02:56.135 [195/740] Linking static target lib/net/libnet_crc_avx512_lib.a 00:02:56.135 [196/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf.c.o 00:02:56.135 [197/740] Generating lib/rte_regexdev_def with a custom command 00:02:56.135 [198/740] Generating lib/rte_regexdev_mingw with a custom command 00:02:56.135 [199/740] Compiling C object lib/librte_acl.a.p/acl_rte_acl.c.o 00:02:56.135 [200/740] Generating lib/rte_dmadev_def with a custom command 00:02:56.135 [201/740] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:02:56.135 [202/740] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:02:56.135 [203/740] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:02:56.135 [204/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load.c.o 00:02:56.135 [205/740] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:02:56.135 [206/740] Generating lib/rte_rib_def with a custom command 00:02:56.135 [207/740] Generating lib/rte_dmadev_mingw with a custom command 00:02:56.135 [208/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_dump.c.o 00:02:56.135 [209/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:02:56.135 [210/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:02:56.135 [211/740] Generating 
lib/rte_rib_mingw with a custom command 00:02:56.135 [212/740] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:02:56.135 [213/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_match_sse.c.o 00:02:56.135 [214/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:02:56.135 [215/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:02:56.135 [216/740] Generating lib/rte_reorder_mingw with a custom command 00:02:56.135 [217/740] Generating lib/rte_reorder_def with a custom command 00:02:56.135 [218/740] Linking static target lib/librte_net.a 00:02:56.135 [219/740] Compiling C object lib/librte_bitratestats.a.p/bitratestats_rte_bitrate.c.o 00:02:56.135 [220/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_ring.c.o 00:02:56.135 [221/740] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:02:56.135 [222/740] Linking static target lib/librte_bitratestats.a 00:02:56.135 [223/740] Generating lib/rte_sched_mingw with a custom command 00:02:56.135 [224/740] Generating lib/rte_security_def with a custom command 00:02:56.135 [225/740] Generating lib/rte_security_mingw with a custom command 00:02:56.135 [226/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:02:56.135 [227/740] Generating lib/rte_stack_def with a custom command 00:02:56.135 [228/740] Generating lib/rte_stack_mingw with a custom command 00:02:56.135 [229/740] Generating lib/rte_sched_def with a custom command 00:02:56.135 [230/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:02:56.135 [231/740] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_private.c.o 00:02:56.135 [232/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:02:56.135 [233/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:02:56.135 [234/740] Generating lib/rte_vhost_def with a custom command 00:02:56.135 [235/740] Generating lib/rte_vhost_mingw with a custom command 00:02:56.135 [236/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:02:56.135 [237/740] Compiling C object lib/librte_sched.a.p/sched_rte_pie.c.o 00:02:56.135 [238/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load_elf.c.o 00:02:56.136 [239/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:02:56.136 [240/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack_std.c.o 00:02:56.136 [241/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:02:56.136 [242/740] Generating lib/rte_ipsec_mingw with a custom command 00:02:56.136 [243/740] Generating lib/rte_ipsec_def with a custom command 00:02:56.136 [244/740] Compiling C object lib/librte_gso.a.p/gso_gso_udp4.c.o 00:02:56.136 [245/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:02:56.136 [246/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack_lf.c.o 00:02:56.136 [247/740] Compiling C object lib/librte_gso.a.p/gso_gso_tcp4.c.o 00:02:56.397 [248/740] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_udp4.c.o 00:02:56.397 [249/740] Compiling C object lib/librte_sched.a.p/sched_rte_red.c.o 00:02:56.397 [250/740] Compiling C object lib/librte_power.a.p/power_rte_power_empty_poll.c.o 00:02:56.397 [251/740] Compiling C object lib/librte_sched.a.p/sched_rte_approx.c.o 00:02:56.397 [252/740] Generating lib/rte_fib_def with a custom command 00:02:56.397 [253/740] Generating lib/rte_fib_mingw with a custom command 00:02:56.397 [254/740] Compiling C object 
lib/librte_stack.a.p/stack_rte_stack.c.o 00:02:56.397 [255/740] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_tcp4.c.o 00:02:56.397 [256/740] Compiling C object lib/librte_gso.a.p/gso_rte_gso.c.o 00:02:56.397 [257/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_single.c.o 00:02:56.397 [258/740] Linking static target lib/librte_stack.a 00:02:56.397 [259/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_reassembly.c.o 00:02:56.397 [260/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:02:56.397 [261/740] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_trace_points.c.o 00:02:56.397 [262/740] Compiling C object lib/librte_acl.a.p/acl_acl_gen.c.o 00:02:56.397 [263/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_convert.c.o 00:02:56.397 [264/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:02:56.397 [265/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_reassembly.c.o 00:02:56.397 [266/740] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:02:56.397 [267/740] Generating lib/rte_port_def with a custom command 00:02:56.397 [268/740] Generating lib/rte_port_mingw with a custom command 00:02:56.397 [269/740] Generating lib/rte_pdump_def with a custom command 00:02:56.397 [270/740] Linking static target lib/librte_compressdev.a 00:02:56.397 [271/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:02:56.397 [272/740] Generating lib/cfgfile.sym_chk with a custom command (wrapped by meson to capture output) 00:02:56.397 [273/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:02:56.397 [274/740] Generating lib/rte_pdump_mingw with a custom command 00:02:56.397 [275/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_scalar.c.o 00:02:56.397 [276/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_exec.c.o 00:02:56.397 [277/740] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:02:56.397 [278/740] Compiling C object lib/librte_fib.a.p/fib_rte_fib.c.o 00:02:56.397 [279/740] Generating lib/jobstats.sym_chk with a custom command (wrapped by meson to capture output) 00:02:56.397 [280/740] Linking static target lib/librte_rcu.a 00:02:56.397 [281/740] Compiling C object lib/librte_gro.a.p/gro_gro_tcp4.c.o 00:02:56.397 [282/740] Compiling C object lib/librte_gro.a.p/gro_gro_udp4.c.o 00:02:56.397 [283/740] Compiling C object lib/librte_rawdev.a.p/rawdev_rte_rawdev.c.o 00:02:56.397 [284/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:02:56.397 [285/740] Generating lib/bitratestats.sym_chk with a custom command (wrapped by meson to capture output) 00:02:56.397 [286/740] Linking static target lib/librte_rawdev.a 00:02:56.397 [287/740] Linking static target lib/librte_mempool.a 00:02:56.659 [288/740] Compiling C object lib/librte_gro.a.p/gro_rte_gro.c.o 00:02:56.659 [289/740] Compiling C object lib/librte_bbdev.a.p/bbdev_rte_bbdev.c.o 00:02:56.659 [290/740] Generating lib/rte_table_def with a custom command 00:02:56.659 [291/740] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_udp4.c.o 00:02:56.659 [292/740] Linking static target lib/librte_bbdev.a 00:02:56.659 [293/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ip_frag_common.c.o 00:02:56.659 [294/740] Generating lib/rte_table_mingw with a custom command 00:02:56.659 [295/740] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:02:56.659 [296/740] Compiling C object 
lib/librte_gpudev.a.p/gpudev_gpudev.c.o 00:02:56.659 [297/740] Compiling C object lib/librte_member.a.p/member_rte_member.c.o 00:02:56.659 [298/740] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:02:56.659 [299/740] Linking static target lib/librte_gpudev.a 00:02:56.659 [300/740] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_tcp4.c.o 00:02:56.659 [301/740] Linking static target lib/librte_dmadev.a 00:02:56.659 [302/740] Linking static target lib/librte_gro.a 00:02:56.659 [303/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_ip_frag_internal.c.o 00:02:56.659 [304/740] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:02:56.659 [305/740] Generating lib/rte_pipeline_def with a custom command 00:02:56.659 [306/740] Compiling C object lib/librte_table.a.p/table_rte_swx_keycmp.c.o 00:02:56.659 [307/740] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:02:56.659 [308/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_fragmentation.c.o 00:02:56.659 [309/740] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:02:56.659 [310/740] Generating lib/stack.sym_chk with a custom command (wrapped by meson to capture output) 00:02:56.659 [311/740] Generating lib/metrics.sym_chk with a custom command (wrapped by meson to capture output) 00:02:56.659 [312/740] Compiling C object lib/librte_gso.a.p/gso_gso_common.c.o 00:02:56.659 [313/740] Compiling C object lib/librte_latencystats.a.p/latencystats_rte_latencystats.c.o 00:02:56.659 [314/740] Generating lib/rte_pipeline_mingw with a custom command 00:02:56.659 [315/740] Linking static target lib/librte_gso.a 00:02:56.659 [316/740] Linking static target lib/librte_latencystats.a 00:02:56.659 [317/740] Compiling C object lib/librte_power.a.p/power_rte_power_intel_uncore.c.o 00:02:56.659 [318/740] Compiling C object lib/member/libsketch_avx512_tmp.a.p/rte_member_sketch_avx512.c.o 00:02:56.659 [319/740] Linking target lib/librte_telemetry.so.23.0 00:02:56.659 [320/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_pkt.c.o 00:02:56.659 [321/740] Linking static target lib/member/libsketch_avx512_tmp.a 00:02:56.659 [322/740] Generating lib/rte_graph_def with a custom command 00:02:56.659 [323/740] Generating lib/rte_graph_mingw with a custom command 00:02:56.659 [324/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor.c.o 00:02:56.659 [325/740] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:02:56.659 [326/740] Linking static target lib/librte_distributor.a 00:02:56.659 [327/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_fragmentation.c.o 00:02:56.659 [328/740] Linking static target lib/librte_ip_frag.a 00:02:56.922 [329/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_learner.c.o 00:02:56.922 [330/740] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm.c.o 00:02:56.922 [331/740] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:02:56.922 [332/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ses.c.o 00:02:56.922 [333/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_em.c.o 00:02:56.922 [334/740] Compiling C object lib/librte_regexdev.a.p/regexdev_rte_regexdev.c.o 00:02:56.922 [335/740] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:02:56.922 [336/740] Compiling C object lib/librte_member.a.p/member_rte_member_vbf.c.o 00:02:56.922 [337/740] Compiling C object 
lib/librte_node.a.p/node_null.c.o 00:02:56.922 [338/740] Compiling C object lib/librte_fib.a.p/fib_rte_fib6.c.o 00:02:56.922 [339/740] Linking static target lib/librte_regexdev.a 00:02:56.922 [340/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_validate.c.o 00:02:56.922 [341/740] Generating lib/rte_node_def with a custom command 00:02:56.922 [342/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:02:56.922 [343/740] Generating symbol file lib/librte_telemetry.so.23.0.p/librte_telemetry.so.23.0.symbols 00:02:56.922 [344/740] Generating lib/rte_node_mingw with a custom command 00:02:56.922 [345/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_telemetry.c.o 00:02:56.922 [346/740] Linking static target lib/librte_eal.a 00:02:56.922 [347/740] Generating lib/gro.sym_chk with a custom command (wrapped by meson to capture output) 00:02:56.922 [348/740] Generating lib/gso.sym_chk with a custom command (wrapped by meson to capture output) 00:02:56.922 [349/740] Generating drivers/rte_bus_pci_def with a custom command 00:02:56.922 [350/740] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:02:56.922 [351/740] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:02:56.922 [352/740] Compiling C object lib/librte_rib.a.p/rib_rte_rib.c.o 00:02:56.922 [353/740] Generating drivers/rte_bus_pci_mingw with a custom command 00:02:56.922 [354/740] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:02:56.922 [355/740] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:02:56.922 [356/740] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:02:56.922 [357/740] Generating lib/latencystats.sym_chk with a custom command (wrapped by meson to capture output) 00:02:56.922 [358/740] Linking static target lib/librte_power.a 00:02:56.922 [359/740] Compiling C object lib/librte_port.a.p/port_rte_port_sched.c.o 00:02:56.922 [360/740] Linking static target lib/librte_reorder.a 00:02:56.922 [361/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:02:56.922 [362/740] Generating drivers/rte_bus_vdev_def with a custom command 00:02:56.922 [363/740] Generating drivers/rte_bus_vdev_mingw with a custom command 00:02:56.922 [364/740] Compiling C object lib/librte_fib.a.p/fib_dir24_8_avx512.c.o 00:02:56.922 [365/740] Generating drivers/rte_mempool_ring_mingw with a custom command 00:02:56.922 [366/740] Generating drivers/rte_mempool_ring_def with a custom command 00:02:56.922 [367/740] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:02:57.184 [368/740] Compiling C object lib/librte_fib.a.p/fib_trie_avx512.c.o 00:02:57.184 [369/740] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:02:57.184 [370/740] Compiling C object lib/librte_table.a.p/table_rte_table_array.c.o 00:02:57.184 [371/740] Linking static target lib/librte_security.a 00:02:57.184 [372/740] Compiling C object lib/librte_table.a.p/table_rte_table_stub.c.o 00:02:57.184 [373/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:02:57.184 [374/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_cuckoo.c.o 00:02:57.184 [375/740] Compiling C object lib/librte_pcapng.a.p/pcapng_rte_pcapng.c.o 00:02:57.184 [376/740] Generating lib/distributor.sym_chk with a custom command (wrapped by meson to capture output) 00:02:57.184 [377/740] Linking static target lib/librte_pcapng.a 00:02:57.184 [378/740] Compiling C object 
lib/librte_table.a.p/table_rte_swx_table_wm.c.o 00:02:57.184 [379/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_timer_adapter.c.o 00:02:57.184 [380/740] Compiling C object lib/librte_ipsec.a.p/ipsec_sa.c.o 00:02:57.184 [381/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:02:57.184 [382/740] Compiling C object lib/librte_table.a.p/table_rte_table_lpm.c.o 00:02:57.184 [383/740] Generating lib/rawdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:57.184 [384/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_selector.c.o 00:02:57.184 [385/740] Linking static target lib/librte_mbuf.a 00:02:57.184 [386/740] Compiling C object lib/librte_table.a.p/table_rte_table_lpm_ipv6.c.o 00:02:57.184 [387/740] Generating drivers/rte_net_i40e_def with a custom command 00:02:57.184 [388/740] Generating lib/ip_frag.sym_chk with a custom command (wrapped by meson to capture output) 00:02:57.184 [389/740] Compiling C object lib/librte_graph.a.p/graph_graph_debug.c.o 00:02:57.184 [390/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:02:57.184 [391/740] Generating drivers/rte_net_i40e_mingw with a custom command 00:02:57.184 [392/740] Compiling C object lib/librte_graph.a.p/graph_graph_ops.c.o 00:02:57.184 [393/740] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:02:57.184 [394/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_crypto_adapter.c.o 00:02:57.184 [395/740] Compiling C object lib/librte_graph.a.p/graph_graph_populate.c.o 00:02:57.184 [396/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_jit_x86.c.o 00:02:57.446 [397/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_port_in_action.c.o 00:02:57.446 [398/740] Linking static target lib/librte_bpf.a 00:02:57.446 [399/740] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:02:57.446 [400/740] Compiling C object lib/librte_table.a.p/table_rte_table_acl.c.o 00:02:57.446 [401/740] Linking static target drivers/libtmp_rte_bus_vdev.a 00:02:57.446 [402/740] Compiling C object lib/librte_node.a.p/node_log.c.o 00:02:57.446 [403/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_tx_adapter.c.o 00:02:57.446 [404/740] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_cmdline_test.c.o 00:02:57.446 [405/740] Compiling C object lib/librte_port.a.p/port_rte_port_frag.c.o 00:02:57.446 [406/740] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_commands.c.o 00:02:57.446 [407/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ethdev.c.o 00:02:57.446 [408/740] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:02:57.446 [409/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:02:57.446 [410/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_eventdev.c.o 00:02:57.446 [411/740] Compiling C object lib/librte_graph.a.p/graph_node.c.o 00:02:57.446 [412/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_fd.c.o 00:02:57.446 [413/740] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm6.c.o 00:02:57.446 [414/740] Compiling C object lib/librte_node.a.p/node_ethdev_ctrl.c.o 00:02:57.446 [415/740] Compiling C object lib/librte_rib.a.p/rib_rte_rib6.c.o 00:02:57.446 [416/740] Compiling C object lib/librte_graph.a.p/graph_graph_stats.c.o 00:02:57.446 [417/740] Linking static target lib/librte_lpm.a 00:02:57.446 [418/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 
00:02:57.446 [419/740] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:57.446 [420/740] Linking static target lib/librte_rib.a 00:02:57.446 [421/740] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:57.446 [422/740] Compiling C object lib/librte_graph.a.p/graph_graph.c.o 00:02:57.446 [423/740] Compiling C object lib/librte_port.a.p/port_rte_port_fd.c.o 00:02:57.446 [424/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_sad.c.o 00:02:57.446 [425/740] Linking static target lib/librte_graph.a 00:02:57.446 [426/740] Compiling C object lib/librte_port.a.p/port_rte_port_ras.c.o 00:02:57.446 [427/740] Compiling C object lib/librte_node.a.p/node_pkt_drop.c.o 00:02:57.446 [428/740] Compiling C object lib/librte_member.a.p/member_rte_member_ht.c.o 00:02:57.446 [429/740] Compiling C object lib/librte_port.a.p/port_rte_port_ethdev.c.o 00:02:57.446 [430/740] Generating lib/bbdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:57.446 [431/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:02:57.446 [432/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_main.c.o 00:02:57.446 [433/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_source_sink.c.o 00:02:57.446 [434/740] Compiling C object lib/librte_node.a.p/node_ethdev_tx.c.o 00:02:57.446 [435/740] Compiling C object lib/librte_port.a.p/port_rte_port_sym_crypto.c.o 00:02:57.446 [436/740] Compiling C object lib/librte_fib.a.p/fib_trie.c.o 00:02:57.707 [437/740] Compiling C object lib/librte_port.a.p/port_rte_port_eventdev.c.o 00:02:57.707 [438/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_test.c.o 00:02:57.707 [439/740] Compiling C object lib/librte_efd.a.p/efd_rte_efd.c.o 00:02:57.707 [440/740] Linking static target lib/librte_efd.a 00:02:57.707 [441/740] Compiling C object lib/librte_node.a.p/node_ethdev_rx.c.o 00:02:57.707 [442/740] Compiling C object lib/librte_port.a.p/port_rte_port_source_sink.c.o 00:02:57.707 [443/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_diag.c.o 00:02:57.707 [444/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:02:57.707 [445/740] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:02:57.707 [446/740] Linking static target drivers/libtmp_rte_bus_pci.a 00:02:57.707 [447/740] Generating lib/pcapng.sym_chk with a custom command (wrapped by meson to capture output) 00:02:57.707 [448/740] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:57.707 [449/740] Linking static target drivers/librte_bus_vdev.a 00:02:57.707 [450/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_hmc.c.o 00:02:57.707 [451/740] Compiling C object drivers/librte_bus_vdev.so.23.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:57.707 [452/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ring.c.o 00:02:57.707 [453/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key8.c.o 00:02:57.707 [454/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_parser.c.o 00:02:57.707 [455/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_pipeline.c.o 00:02:57.707 [456/740] Compiling C object lib/librte_fib.a.p/fib_dir24_8.c.o 00:02:57.707 [457/740] Linking static target lib/librte_fib.a 00:02:57.707 [458/740] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:02:57.707 
[459/740] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:02:57.970 [460/740] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:57.970 [461/740] Generating lib/bpf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:57.970 [462/740] Compiling C object lib/librte_acl.a.p/acl_acl_bld.c.o 00:02:57.970 [463/740] Generating lib/gpudev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:57.970 [464/740] Compiling C object lib/librte_pdump.a.p/pdump_rte_pdump.c.o 00:02:57.970 [465/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:02:57.970 [466/740] Generating lib/efd.sym_chk with a custom command (wrapped by meson to capture output) 00:02:57.970 [467/740] Linking static target lib/librte_pdump.a 00:02:57.970 [468/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_ext.c.o 00:02:57.970 [469/740] Generating lib/lpm.sym_chk with a custom command (wrapped by meson to capture output) 00:02:57.970 [470/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key16.c.o 00:02:57.970 [471/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_dcb.c.o 00:02:57.970 [472/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_options_parse.c.o 00:02:57.970 [473/740] Compiling C object lib/librte_node.a.p/node_pkt_cls.c.o 00:02:58.228 [474/740] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:58.228 [475/740] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:02:58.228 [476/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:02:58.228 [477/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_vf_representor.c.o 00:02:58.228 [478/740] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:58.228 [479/740] Compiling C object drivers/librte_bus_pci.so.23.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:58.228 [480/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_lru.c.o 00:02:58.228 [481/740] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:58.228 [482/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vectors.c.o 00:02:58.228 [483/740] Linking static target drivers/librte_bus_pci.a 00:02:58.228 [484/740] Generating lib/rib.sym_chk with a custom command (wrapped by meson to capture output) 00:02:58.228 [485/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_lan_hmc.c.o 00:02:58.228 [486/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_common.c.o 00:02:58.228 [487/740] Generating lib/regexdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:58.229 [488/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_main.c.o 00:02:58.229 [489/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_tm.c.o 00:02:58.229 [490/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key32.c.o 00:02:58.229 [491/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_main.c.o 00:02:58.229 [492/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_flow_gen.c.o 00:02:58.229 [493/740] Linking static target lib/librte_table.a 00:02:58.229 [494/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_items_gen.c.o 00:02:58.229 [495/740] Compiling C object 
app/dpdk-test-eventdev.p/test-eventdev_evt_options.c.o 00:02:58.488 [496/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_adminq.c.o 00:02:58.488 [497/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vector_parsing.c.o 00:02:58.488 [498/740] Compiling C object lib/librte_node.a.p/node_ip4_lookup.c.o 00:02:58.488 [499/740] Generating lib/fib.sym_chk with a custom command (wrapped by meson to capture output) 00:02:58.488 [500/740] Compiling C object app/dpdk-test-acl.p/test-acl_main.c.o 00:02:58.488 [501/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_hash.c.o 00:02:58.488 [502/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_config.c.o 00:02:58.488 [503/740] Compiling C object app/dpdk-dumpcap.p/dumpcap_main.c.o 00:02:58.488 [504/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_init.c.o 00:02:58.488 [505/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ctl.c.o 00:02:58.488 [506/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_main.c.o 00:02:58.488 [507/740] Generating lib/pdump.sym_chk with a custom command (wrapped by meson to capture output) 00:02:58.488 [508/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_common.c.o 00:02:58.488 [509/740] Compiling C object app/dpdk-test-gpudev.p/test-gpudev_main.c.o 00:02:58.488 [510/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_acl.c.o 00:02:58.488 [511/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_stub.c.o 00:02:58.488 [512/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm.c.o 00:02:58.488 [513/740] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:02:58.488 [514/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_throughput.c.o 00:02:58.488 [515/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_options_parsing.c.o 00:02:58.488 [516/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_main.c.o 00:02:58.488 [517/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm_ipv6.c.o 00:02:58.488 [518/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_ops.c.o 00:02:58.488 [519/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_sse.c.o 00:02:58.488 [520/740] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:02:58.488 [521/740] Linking static target drivers/libtmp_rte_mempool_ring.a 00:02:58.746 [522/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_queue.c.o 00:02:58.746 [523/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:02:58.746 [524/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_nvm.c.o 00:02:58.746 [525/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_actions_gen.c.o 00:02:58.746 [526/740] Linking static target lib/librte_cryptodev.a 00:02:58.746 [527/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_atq.c.o 00:02:58.747 [528/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmd_flex_item.c.o 00:02:58.747 [529/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_pf.c.o 00:02:58.747 [530/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_verify.c.o 00:02:58.747 [531/740] Compiling C object lib/librte_sched.a.p/sched_rte_sched.c.o 00:02:58.747 [532/740] 
Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:58.747 [533/740] Linking static target lib/librte_sched.a 00:02:58.747 [534/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_throughput.c.o 00:02:58.747 [535/740] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_outb.c.o 00:02:58.747 [536/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_hash.c.o 00:02:58.747 [537/740] Compiling C object lib/librte_node.a.p/node_ip4_rewrite.c.o 00:02:58.747 [538/740] Linking static target lib/librte_node.a 00:02:58.747 [539/740] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_inb.c.o 00:02:58.747 [540/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev.c.o 00:02:58.747 [541/740] Linking static target lib/librte_ipsec.a 00:02:58.747 [542/740] Compiling C object app/dpdk-proc-info.p/proc-info_main.c.o 00:02:58.747 [543/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_pmd_cyclecount.c.o 00:02:58.747 [544/740] Generating lib/graph.sym_chk with a custom command (wrapped by meson to capture output) 00:02:58.747 [545/740] Compiling C object app/dpdk-testpmd.p/test-pmd_bpf_cmd.c.o 00:02:58.747 [546/740] Compiling C object app/dpdk-testpmd.p/test-pmd_5tswap.c.o 00:02:58.747 [547/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_common.c.o 00:02:58.747 [548/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_vector.c.o 00:02:58.747 [549/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:02:58.747 [550/740] Linking static target lib/librte_ethdev.a 00:02:59.006 [551/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_verify.c.o 00:02:59.006 [552/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_latency.c.o 00:02:59.006 [553/740] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:02:59.006 [554/740] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:59.006 [555/740] Compiling C object drivers/librte_mempool_ring.so.23.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:59.006 [556/740] Linking static target drivers/librte_mempool_ring.a 00:02:59.006 [557/740] Compiling C object app/dpdk-testpmd.p/test-pmd_shared_rxq_fwd.c.o 00:02:59.006 [558/740] Compiling C object app/dpdk-testpmd.p/test-pmd_flowgen.c.o 00:02:59.006 [559/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_queue.c.o 00:02:59.006 [560/740] Compiling C object lib/librte_port.a.p/port_rte_port_ring.c.o 00:02:59.006 [561/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_runtime.c.o 00:02:59.006 [562/740] Compiling C object app/dpdk-testpmd.p/test-pmd_icmpecho.c.o 00:02:59.006 [563/740] Compiling C object app/dpdk-testpmd.p/test-pmd_macfwd.c.o 00:02:59.006 [564/740] Compiling C object app/dpdk-testpmd.p/test-pmd_iofwd.c.o 00:02:59.006 [565/740] Linking static target lib/librte_port.a 00:02:59.006 [566/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_queue.c.o 00:02:59.006 [567/740] Compiling C object app/dpdk-test-security-perf.p/test-security-perf_test_security_perf.c.o 00:02:59.006 [568/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_fdir.c.o 00:02:59.006 [569/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_mtr.c.o 00:02:59.006 [570/740] Generating lib/node.sym_chk with a custom command (wrapped by meson to 
capture output) 00:02:59.006 [571/740] Compiling C object app/dpdk-pdump.p/pdump_main.c.o 00:02:59.006 [572/740] Compiling C object app/dpdk-testpmd.p/test-pmd_rxonly.c.o 00:02:59.006 [573/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_tm.c.o 00:02:59.006 [574/740] Compiling C object app/dpdk-testpmd.p/test-pmd_ieee1588fwd.c.o 00:02:59.006 [575/740] Compiling C object lib/librte_member.a.p/member_rte_member_sketch.c.o 00:02:59.006 [576/740] Compiling C object app/dpdk-testpmd.p/test-pmd_macswap.c.o 00:02:59.006 [577/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_rx_adapter.c.o 00:02:59.006 [578/740] Linking static target lib/librte_member.a 00:02:59.006 [579/740] Linking static target lib/librte_eventdev.a 00:02:59.006 [580/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_common.c.o 00:02:59.006 [581/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_cyclecount.c.o 00:02:59.006 [582/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_atq.c.o 00:02:59.006 [583/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_atq.c.o 00:02:59.006 [584/740] Generating lib/table.sym_chk with a custom command (wrapped by meson to capture output) 00:02:59.265 [585/740] Compiling C object app/dpdk-test-fib.p/test-fib_main.c.o 00:02:59.265 [586/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_main.c.o 00:02:59.265 [587/740] Compiling C object app/dpdk-testpmd.p/test-pmd_util.c.o 00:02:59.265 [588/740] Compiling C object app/dpdk-test-sad.p/test-sad_main.c.o 00:02:59.265 [589/740] Compiling C object app/dpdk-test-regex.p/test-regex_main.c.o 00:02:59.265 [590/740] Generating lib/ipsec.sym_chk with a custom command (wrapped by meson to capture output) 00:02:59.265 [591/740] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:02:59.265 [592/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_flow.c.o 00:02:59.265 [593/740] Linking static target lib/librte_hash.a 00:02:59.265 [594/740] Generating lib/sched.sym_chk with a custom command (wrapped by meson to capture output) 00:02:59.265 [595/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_rte_pmd_i40e.c.o 00:02:59.265 [596/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx2.c.o 00:02:59.265 [597/740] Compiling C object app/dpdk-testpmd.p/test-pmd_parameters.c.o 00:02:59.265 [598/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_sse.c.o 00:02:59.265 [599/740] Compiling C object app/dpdk-testpmd.p/.._drivers_net_i40e_i40e_testpmd.c.o 00:02:59.265 [600/740] Compiling C object drivers/net/i40e/libi40e_avx512_lib.a.p/i40e_rxtx_vec_avx512.c.o 00:02:59.265 [601/740] Linking static target drivers/net/i40e/libi40e_avx512_lib.a 00:02:59.524 [602/740] Compiling C object app/dpdk-test-security-perf.p/test_test_cryptodev_security_ipsec.c.o 00:02:59.524 [603/740] Compiling C object app/dpdk-testpmd.p/test-pmd_txonly.c.o 00:02:59.524 [604/740] Generating lib/member.sym_chk with a custom command (wrapped by meson to capture output) 00:02:59.524 [605/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline_spec.c.o 00:02:59.524 [606/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_avx2.c.o 00:02:59.524 [607/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_common.c.o 00:02:59.524 [608/740] Compiling C object app/dpdk-testpmd.p/test-pmd_csumonly.c.o 00:02:59.524 
[609/740] Linking static target drivers/net/i40e/base/libi40e_base.a 00:02:59.783 [610/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx512.c.o 00:02:59.783 [611/740] Linking static target lib/librte_acl.a 00:02:59.783 [612/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_common.c.o 00:02:59.783 [613/740] Generating lib/port.sym_chk with a custom command (wrapped by meson to capture output) 00:03:00.043 [614/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline.c.o 00:03:00.043 [615/740] Generating lib/acl.sym_chk with a custom command (wrapped by meson to capture output) 00:03:00.043 [616/740] Compiling C object app/dpdk-testpmd.p/test-pmd_testpmd.c.o 00:03:00.302 [617/740] Compiling C object app/dpdk-testpmd.p/test-pmd_noisy_vnf.c.o 00:03:00.302 [618/740] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:03:00.561 [619/740] Compiling C object app/dpdk-testpmd.p/test-pmd_config.c.o 00:03:00.561 [620/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx.c.o 00:03:01.128 [621/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_flow.c.o 00:03:01.396 [622/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_ethdev.c.o 00:03:01.396 [623/740] Linking static target drivers/libtmp_rte_net_i40e.a 00:03:01.657 [624/740] Generating drivers/rte_net_i40e.pmd.c with a custom command 00:03:01.948 [625/740] Compiling C object drivers/librte_net_i40e.so.23.0.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:03:01.948 [626/740] Compiling C object drivers/librte_net_i40e.a.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:03:01.948 [627/740] Linking static target drivers/librte_net_i40e.a 00:03:02.249 [628/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:03:02.249 [629/740] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:03:02.249 [630/740] Generating lib/eventdev.sym_chk with a custom command (wrapped by meson to capture output) 00:03:02.506 [631/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_perf.c.o 00:03:02.506 [632/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline.c.o 00:03:03.071 [633/740] Generating drivers/rte_net_i40e.sym_chk with a custom command (wrapped by meson to capture output) 00:03:07.262 [634/740] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:03:08.199 [635/740] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:03:08.458 [636/740] Linking static target lib/librte_vhost.a 00:03:09.033 [637/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_table_action.c.o 00:03:09.033 [638/740] Linking static target lib/librte_pipeline.a 00:03:09.602 [639/740] Linking target app/dpdk-pdump 00:03:09.602 [640/740] Linking target app/dpdk-test-acl 00:03:09.602 [641/740] Linking target app/dpdk-test-sad 00:03:09.602 [642/740] Linking target app/dpdk-test-cmdline 00:03:09.602 [643/740] Linking target app/dpdk-test-fib 00:03:09.602 [644/740] Linking target app/dpdk-dumpcap 00:03:09.602 [645/740] Linking target app/dpdk-proc-info 00:03:09.602 [646/740] Linking target app/dpdk-test-gpudev 00:03:09.602 [647/740] Linking target app/dpdk-test-bbdev 00:03:09.603 [648/740] Linking target app/dpdk-test-pipeline 00:03:09.603 [649/740] Linking target app/dpdk-test-regex 00:03:09.603 [650/740] Linking target app/dpdk-test-compress-perf 00:03:09.603 [651/740] Linking target app/dpdk-test-flow-perf 00:03:09.603 [652/740] Linking 
target app/dpdk-test-crypto-perf 00:03:09.603 [653/740] Linking target app/dpdk-test-security-perf 00:03:09.603 [654/740] Linking target app/dpdk-test-eventdev 00:03:09.603 [655/740] Linking target app/dpdk-testpmd 00:03:10.542 [656/740] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:03:10.542 [657/740] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:03:10.542 [658/740] Linking target lib/librte_eal.so.23.0 00:03:10.802 [659/740] Generating symbol file lib/librte_eal.so.23.0.p/librte_eal.so.23.0.symbols 00:03:10.802 [660/740] Linking target lib/librte_dmadev.so.23.0 00:03:10.802 [661/740] Linking target lib/librte_stack.so.23.0 00:03:10.802 [662/740] Linking target lib/librte_ring.so.23.0 00:03:10.802 [663/740] Linking target lib/librte_meter.so.23.0 00:03:10.802 [664/740] Linking target lib/librte_pci.so.23.0 00:03:10.802 [665/740] Linking target lib/librte_graph.so.23.0 00:03:10.802 [666/740] Linking target lib/librte_timer.so.23.0 00:03:10.802 [667/740] Linking target lib/librte_cfgfile.so.23.0 00:03:10.802 [668/740] Linking target lib/librte_jobstats.so.23.0 00:03:10.802 [669/740] Linking target lib/librte_rawdev.so.23.0 00:03:10.802 [670/740] Linking target drivers/librte_bus_vdev.so.23.0 00:03:10.802 [671/740] Linking target lib/librte_acl.so.23.0 00:03:11.061 [672/740] Generating symbol file lib/librte_ring.so.23.0.p/librte_ring.so.23.0.symbols 00:03:11.061 [673/740] Generating symbol file lib/librte_dmadev.so.23.0.p/librte_dmadev.so.23.0.symbols 00:03:11.061 [674/740] Generating symbol file lib/librte_graph.so.23.0.p/librte_graph.so.23.0.symbols 00:03:11.061 [675/740] Generating symbol file lib/librte_pci.so.23.0.p/librte_pci.so.23.0.symbols 00:03:11.061 [676/740] Generating symbol file lib/librte_meter.so.23.0.p/librte_meter.so.23.0.symbols 00:03:11.061 [677/740] Generating symbol file drivers/librte_bus_vdev.so.23.0.p/librte_bus_vdev.so.23.0.symbols 00:03:11.061 [678/740] Generating symbol file lib/librte_timer.so.23.0.p/librte_timer.so.23.0.symbols 00:03:11.061 [679/740] Generating symbol file lib/librte_acl.so.23.0.p/librte_acl.so.23.0.symbols 00:03:11.061 [680/740] Linking target lib/librte_rcu.so.23.0 00:03:11.061 [681/740] Linking target lib/librte_mempool.so.23.0 00:03:11.062 [682/740] Linking target drivers/librte_bus_pci.so.23.0 00:03:11.062 [683/740] Generating symbol file lib/librte_rcu.so.23.0.p/librte_rcu.so.23.0.symbols 00:03:11.062 [684/740] Generating symbol file lib/librte_mempool.so.23.0.p/librte_mempool.so.23.0.symbols 00:03:11.062 [685/740] Generating symbol file drivers/librte_bus_pci.so.23.0.p/librte_bus_pci.so.23.0.symbols 00:03:11.321 [686/740] Linking target lib/librte_mbuf.so.23.0 00:03:11.321 [687/740] Linking target lib/librte_rib.so.23.0 00:03:11.321 [688/740] Linking target drivers/librte_mempool_ring.so.23.0 00:03:11.321 [689/740] Generating symbol file lib/librte_rib.so.23.0.p/librte_rib.so.23.0.symbols 00:03:11.321 [690/740] Generating symbol file lib/librte_mbuf.so.23.0.p/librte_mbuf.so.23.0.symbols 00:03:11.321 [691/740] Linking target lib/librte_fib.so.23.0 00:03:11.321 [692/740] Linking target lib/librte_net.so.23.0 00:03:11.321 [693/740] Linking target lib/librte_gpudev.so.23.0 00:03:11.321 [694/740] Linking target lib/librte_compressdev.so.23.0 00:03:11.321 [695/740] Linking target lib/librte_distributor.so.23.0 00:03:11.321 [696/740] Linking target lib/librte_bbdev.so.23.0 00:03:11.321 [697/740] Linking target lib/librte_regexdev.so.23.0 00:03:11.321 
[698/740] Linking target lib/librte_reorder.so.23.0 00:03:11.321 [699/740] Linking target lib/librte_cryptodev.so.23.0 00:03:11.321 [700/740] Linking target lib/librte_sched.so.23.0 00:03:11.580 [701/740] Generating symbol file lib/librte_net.so.23.0.p/librte_net.so.23.0.symbols 00:03:11.581 [702/740] Generating symbol file lib/librte_sched.so.23.0.p/librte_sched.so.23.0.symbols 00:03:11.581 [703/740] Generating symbol file lib/librte_cryptodev.so.23.0.p/librte_cryptodev.so.23.0.symbols 00:03:11.581 [704/740] Linking target lib/librte_hash.so.23.0 00:03:11.581 [705/740] Linking target lib/librte_cmdline.so.23.0 00:03:11.581 [706/740] Linking target lib/librte_ethdev.so.23.0 00:03:11.581 [707/740] Linking target lib/librte_security.so.23.0 00:03:11.581 [708/740] Generating symbol file lib/librte_hash.so.23.0.p/librte_hash.so.23.0.symbols 00:03:11.581 [709/740] Generating symbol file lib/librte_ethdev.so.23.0.p/librte_ethdev.so.23.0.symbols 00:03:11.840 [710/740] Generating symbol file lib/librte_security.so.23.0.p/librte_security.so.23.0.symbols 00:03:11.840 [711/740] Linking target lib/librte_member.so.23.0 00:03:11.840 [712/740] Linking target lib/librte_efd.so.23.0 00:03:11.840 [713/740] Linking target lib/librte_lpm.so.23.0 00:03:11.840 [714/740] Linking target lib/librte_bpf.so.23.0 00:03:11.840 [715/740] Linking target lib/librte_metrics.so.23.0 00:03:11.840 [716/740] Linking target lib/librte_gro.so.23.0 00:03:11.840 [717/740] Linking target lib/librte_gso.so.23.0 00:03:11.840 [718/740] Linking target lib/librte_pcapng.so.23.0 00:03:11.840 [719/740] Linking target lib/librte_ip_frag.so.23.0 00:03:11.840 [720/740] Linking target lib/librte_power.so.23.0 00:03:11.840 [721/740] Linking target lib/librte_eventdev.so.23.0 00:03:11.840 [722/740] Linking target lib/librte_ipsec.so.23.0 00:03:11.840 [723/740] Linking target lib/librte_vhost.so.23.0 00:03:11.840 [724/740] Linking target drivers/librte_net_i40e.so.23.0 00:03:11.840 [725/740] Generating symbol file lib/librte_lpm.so.23.0.p/librte_lpm.so.23.0.symbols 00:03:11.840 [726/740] Generating symbol file lib/librte_bpf.so.23.0.p/librte_bpf.so.23.0.symbols 00:03:11.840 [727/740] Generating symbol file lib/librte_metrics.so.23.0.p/librte_metrics.so.23.0.symbols 00:03:11.840 [728/740] Generating symbol file lib/librte_pcapng.so.23.0.p/librte_pcapng.so.23.0.symbols 00:03:11.840 [729/740] Generating symbol file lib/librte_ip_frag.so.23.0.p/librte_ip_frag.so.23.0.symbols 00:03:11.840 [730/740] Generating symbol file lib/librte_eventdev.so.23.0.p/librte_eventdev.so.23.0.symbols 00:03:11.840 [731/740] Linking target lib/librte_node.so.23.0 00:03:11.840 [732/740] Linking target lib/librte_latencystats.so.23.0 00:03:11.840 [733/740] Linking target lib/librte_bitratestats.so.23.0 00:03:11.840 [734/740] Linking target lib/librte_pdump.so.23.0 00:03:12.099 [735/740] Linking target lib/librte_port.so.23.0 00:03:12.099 [736/740] Generating symbol file lib/librte_port.so.23.0.p/librte_port.so.23.0.symbols 00:03:12.099 [737/740] Linking target lib/librte_table.so.23.0 00:03:12.358 [738/740] Generating symbol file lib/librte_table.so.23.0.p/librte_table.so.23.0.symbols 00:03:14.266 [739/740] Generating lib/pipeline.sym_chk with a custom command (wrapped by meson to capture output) 00:03:14.266 [740/740] Linking target lib/librte_pipeline.so.23.0 00:03:14.266 08:21:06 -- common/autobuild_common.sh@190 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 install 00:03:14.266 ninja: Entering directory 
`/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:03:14.266 [0/1] Installing files. 00:03:14.529 Installing subdir /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples 00:03:14.529 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:03:14.529 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:03:14.529 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:03:14.529 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:03:14.529 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:03:14.529 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_route.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:03:14.529 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:03:14.529 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:03:14.530 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_fib.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_sequential.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:03:14.530 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/flow_blocks.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/flow_classify.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/ipv4_rules_file.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:03:14.530 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/pkt_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/neon/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/neon 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/altivec/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/altivec 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/sse/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/sse 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/ptpclient.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_x86.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:03:14.530 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_nop.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:03:14.530 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:03:14.531 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/app_thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_ov.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cmdline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/stats.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_red.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_pie.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:03:14.531 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/l2fwd-cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/vdpa_blk_compact.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/virtio_net.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk_spec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:03:14.531 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk_compat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ecdsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_aes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_sha.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_tdes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_rsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_gcm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_cmac.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_xts.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_hmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ccm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:03:14.531 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/dmafwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:03:14.532 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/client.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/shared 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/main.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/basicfwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_tx.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep1.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp4.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_process.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/rt.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:03:14.532 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp6.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep0.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:03:14.533 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/load_env.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_null_header_reconstruct.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/run_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs_secgw.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/data_rxtx.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:03:14.533 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/bypass_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_ipv6opts.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/linux_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd 00:03:14.533 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/node/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/node 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/node/node.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/node 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/shared 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:03:14.533 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/kni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:03:14.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/kni.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route_ecmp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/kni.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/firewall.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/tap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow_crypto.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t1.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t3.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/README to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/dummy.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t2.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ethdev.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_routing_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:03:14.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_group_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/packet.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/pcap.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/ntb_fwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer
00:03:14.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer
00:03:14.535 Installing lib/librte_kvargs.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_kvargs.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_telemetry.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_telemetry.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_eal.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_eal.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_rcu.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_rcu.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_mempool.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_mempool.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_mbuf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_mbuf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_net.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_net.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_meter.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_meter.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_ethdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_ethdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_cmdline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_cmdline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_metrics.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_metrics.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_hash.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_hash.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_timer.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_timer.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_acl.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_acl.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_bbdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_bbdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_bitratestats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_bitratestats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_bpf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_bpf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_cfgfile.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_cfgfile.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_compressdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_compressdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_cryptodev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_cryptodev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_distributor.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_distributor.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_efd.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_efd.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_eventdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_eventdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.535 Installing lib/librte_gpudev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.536 Installing lib/librte_gpudev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.536 Installing lib/librte_gro.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.536 Installing lib/librte_gro.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.536 Installing lib/librte_gso.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.536 Installing lib/librte_gso.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.536 Installing lib/librte_ip_frag.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.536 Installing lib/librte_ip_frag.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.536 Installing lib/librte_jobstats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.536 Installing lib/librte_jobstats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.536 Installing lib/librte_latencystats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.536 Installing lib/librte_latencystats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.536 Installing lib/librte_lpm.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.536 Installing lib/librte_lpm.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.536 Installing lib/librte_member.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.536 Installing lib/librte_member.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.536 Installing lib/librte_pcapng.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.536 Installing lib/librte_pcapng.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.536 Installing lib/librte_power.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.536 Installing lib/librte_power.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.536 Installing lib/librte_rawdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.536 Installing lib/librte_rawdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.536 Installing lib/librte_regexdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.536 Installing lib/librte_regexdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.536 Installing lib/librte_dmadev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.536 Installing lib/librte_dmadev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.536 Installing lib/librte_rib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.799 Installing lib/librte_rib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.799 Installing lib/librte_reorder.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.799 Installing lib/librte_reorder.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.799 Installing lib/librte_sched.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.799 Installing lib/librte_sched.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.799 Installing lib/librte_security.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.799 Installing lib/librte_security.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.799 Installing lib/librte_stack.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.799 Installing lib/librte_stack.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.799 Installing lib/librte_vhost.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.799 Installing lib/librte_vhost.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.799 Installing lib/librte_ipsec.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.799 Installing lib/librte_ipsec.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.799 Installing lib/librte_fib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.799 Installing lib/librte_fib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.799 Installing lib/librte_port.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.799 Installing lib/librte_port.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.799 Installing lib/librte_pdump.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.799 Installing lib/librte_pdump.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.799 Installing lib/librte_table.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.799 Installing lib/librte_table.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.799 Installing lib/librte_pipeline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.799 Installing lib/librte_pipeline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.799 Installing lib/librte_graph.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.799 Installing lib/librte_graph.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.799 Installing lib/librte_node.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.799 Installing lib/librte_node.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.799 Installing drivers/librte_bus_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.799 Installing drivers/librte_bus_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0
00:03:14.799 Installing drivers/librte_bus_vdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.799 Installing drivers/librte_bus_vdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0
00:03:14.799 Installing drivers/librte_mempool_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.799 Installing drivers/librte_mempool_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0
00:03:14.799 Installing drivers/librte_net_i40e.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:14.799 Installing drivers/librte_net_i40e.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0
00:03:14.799 Installing app/dpdk-dumpcap to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:03:14.800 Installing app/dpdk-pdump to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:03:14.800 Installing app/dpdk-proc-info to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:03:14.800 Installing app/dpdk-test-acl to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:03:14.800 Installing app/dpdk-test-bbdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:03:14.800 Installing app/dpdk-test-cmdline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:03:14.800 Installing app/dpdk-test-compress-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:03:14.800 Installing app/dpdk-test-crypto-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:03:14.800 Installing app/dpdk-test-eventdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:03:14.800 Installing app/dpdk-test-fib to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:03:14.800 Installing app/dpdk-test-flow-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:03:14.800 Installing app/dpdk-test-gpudev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:03:14.800 Installing app/dpdk-test-pipeline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:03:14.800 Installing app/dpdk-testpmd to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:03:14.800 Installing app/dpdk-test-regex to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:03:14.800 Installing app/dpdk-test-sad to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:03:14.800 Installing app/dpdk-test-security-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/rte_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/kvargs/rte_kvargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/telemetry/rte_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rtm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_alarm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitmap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_branch_prediction.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bus.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_class.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_compat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_debug.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_dev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_devargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_memconfig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_errno.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_epoll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_fbarray.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hexdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hypervisor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_interrupts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_keepalive.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_launch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_log.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_malloc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_mcslock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memory.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memzone.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_feature_defs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_features.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_per_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pflock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_random.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_reciprocal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.800 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqcount.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service_component.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_string_fns.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_tailq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_ticketlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_time.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point_register.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_uuid.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_version.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_vfio.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/linux/include/rte_os.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_c11_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_generic_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_zc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rcu/rte_rcu_qsbr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_ptype.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_pool_ops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_dyn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_udp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_sctp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_icmp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_arp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ether.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_macsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_vxlan.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gre.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gtp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_mpls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_higig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ecpri.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_geneve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_l2tpv2.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ppp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/meter/rte_meter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_cman.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_dev_info.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_eth_ctrl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pci/rte_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_num.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_ipaddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_etheraddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_string.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_rdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_vt100.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.801 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_socket.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_cirbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_portlist.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_fbk_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_jhash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_sw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_x86_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/timer/rte_timer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl_osdep.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_op.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bitratestats/rte_bitrate.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/bpf_def.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cfgfile/rte_cfgfile.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_compressdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_comp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_sym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_asym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/distributor/rte_distributor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/efd/rte_efd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_crypto_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_rx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_tx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_timer_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gpudev/rte_gpudev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gro/rte_gro.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gso/rte_gso.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ip_frag/rte_ip_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/jobstats/rte_jobstats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/latencystats/rte_latencystats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/member/rte_member.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pcapng/rte_pcapng.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_empty_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_intel_uncore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_pmd_mgmt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_guest_channel.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:14.802 Installing
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/reorder/rte_reorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_approx.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_red.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_pie.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.802 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_std.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_c11.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_stubs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vdpa.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_async.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ras.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sym_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdump/rte_pdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_learner.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_selector.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_wm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_array.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_cuckoo.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm_ipv6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_stub.h 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_port_in_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_table_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_extern.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ctl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip4_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_eth_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/pci/rte_bus_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/vdev/rte_bus_vdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/net/i40e/rte_pmd_i40e.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-devbind.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-pmdinfo.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-telemetry.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-hugepages.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:03:14.803 
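The four usertools scripts staged into dpdk/build/bin above are DPDK's standard host-setup helpers. As a hedged sketch of how a test node typically uses them (tool names are from the log; the flags shown are the upstream defaults for this DPDK series, so verify with --help):

  # reserve hugepages before starting DPDK-based tests (flag per upstream dpdk-hugepages.py)
  ./dpdk/build/bin/dpdk-hugepages.py --setup 2G
  # list current NIC-to-driver bindings
  ./dpdk/build/bin/dpdk-devbind.py --status
  # bind a port to vfio-pci; the PCI address here is a placeholder
  ./dpdk/build/bin/dpdk-devbind.py -b vfio-pci 0000:3b:00.0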
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/rte_build_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk-libs.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:03:14.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:03:14.803 Installing symlink pointing to librte_kvargs.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so.23 00:03:14.803 Installing symlink pointing to librte_kvargs.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so 00:03:14.803 Installing symlink pointing to librte_telemetry.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so.23 00:03:14.803 Installing symlink pointing to librte_telemetry.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so 00:03:14.803 Installing symlink pointing to librte_eal.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so.23 00:03:14.803 Installing symlink pointing to librte_eal.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so 00:03:14.803 Installing symlink pointing to librte_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so.23 00:03:14.803 Installing symlink pointing to librte_ring.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so 00:03:14.803 Installing symlink pointing to librte_rcu.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so.23 00:03:14.803 Installing symlink pointing to librte_rcu.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so 00:03:14.803 Installing symlink pointing to librte_mempool.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so.23 00:03:14.803 Installing symlink pointing to librte_mempool.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so 00:03:14.804 Installing symlink pointing to librte_mbuf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so.23 00:03:14.804 Installing symlink pointing to librte_mbuf.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so 00:03:14.804 Installing symlink pointing to librte_net.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so.23 00:03:14.804 Installing symlink pointing to librte_net.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so 00:03:14.804 Installing symlink pointing to librte_meter.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so.23 00:03:14.804 Installing symlink pointing to librte_meter.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so 00:03:14.804 Installing symlink pointing to librte_ethdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so.23 00:03:14.804 Installing symlink pointing to librte_ethdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so 00:03:14.804 Installing symlink pointing to librte_pci.so.23.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so.23 00:03:14.804 Installing symlink pointing to librte_pci.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so 00:03:14.804 Installing symlink pointing to librte_cmdline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so.23 00:03:14.804 Installing symlink pointing to librte_cmdline.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so 00:03:14.804 Installing symlink pointing to librte_metrics.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so.23 00:03:14.804 Installing symlink pointing to librte_metrics.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so 00:03:14.804 Installing symlink pointing to librte_hash.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so.23 00:03:14.804 Installing symlink pointing to librte_hash.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so 00:03:14.804 Installing symlink pointing to librte_timer.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so.23 00:03:14.804 Installing symlink pointing to librte_timer.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so 00:03:14.804 Installing symlink pointing to librte_acl.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so.23 00:03:14.804 Installing symlink pointing to librte_acl.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so 00:03:14.804 Installing symlink pointing to librte_bbdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so.23 00:03:14.804 Installing symlink pointing to librte_bbdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so 00:03:14.804 Installing symlink pointing to librte_bitratestats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so.23 00:03:14.804 Installing symlink pointing to librte_bitratestats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so 00:03:14.804 Installing symlink pointing to librte_bpf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so.23 00:03:14.804 Installing symlink pointing to librte_bpf.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so 00:03:14.804 Installing symlink pointing to librte_cfgfile.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so.23 00:03:14.804 Installing symlink pointing to librte_cfgfile.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so 00:03:14.804 Installing symlink pointing to librte_compressdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so.23 00:03:14.804 Installing symlink pointing to librte_compressdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so 00:03:14.804 Installing symlink pointing to librte_cryptodev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so.23 00:03:14.804 Installing symlink pointing to librte_cryptodev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so 00:03:14.804 Installing symlink pointing 
to librte_distributor.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so.23 00:03:14.804 Installing symlink pointing to librte_distributor.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so 00:03:14.804 Installing symlink pointing to librte_efd.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so.23 00:03:14.804 Installing symlink pointing to librte_efd.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so 00:03:14.804 Installing symlink pointing to librte_eventdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so.23 00:03:14.804 Installing symlink pointing to librte_eventdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so 00:03:14.804 Installing symlink pointing to librte_gpudev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so.23 00:03:14.804 Installing symlink pointing to librte_gpudev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so 00:03:14.804 Installing symlink pointing to librte_gro.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so.23 00:03:14.804 Installing symlink pointing to librte_gro.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so 00:03:14.804 Installing symlink pointing to librte_gso.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so.23 00:03:14.804 Installing symlink pointing to librte_gso.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so 00:03:14.804 Installing symlink pointing to librte_ip_frag.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so.23 00:03:14.804 Installing symlink pointing to librte_ip_frag.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so 00:03:14.804 Installing symlink pointing to librte_jobstats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so.23 00:03:14.804 Installing symlink pointing to librte_jobstats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so 00:03:14.804 Installing symlink pointing to librte_latencystats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so.23 00:03:14.804 Installing symlink pointing to librte_latencystats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so 00:03:14.804 Installing symlink pointing to librte_lpm.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so.23 00:03:14.804 Installing symlink pointing to librte_lpm.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so 00:03:14.804 Installing symlink pointing to librte_member.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so.23 00:03:14.804 Installing symlink pointing to librte_member.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so 00:03:14.804 Installing symlink pointing to librte_pcapng.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so.23 00:03:14.804 Installing symlink pointing to librte_pcapng.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so 00:03:14.804 
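Each "Installing symlink" pair above builds the conventional three-level shared-library chain: the real file carries the full version (librte_pcapng.so.23.0), the SONAME link (librte_pcapng.so.23) is what the runtime loader resolves, and the unversioned link (librte_pcapng.so) is what -lrte_pcapng finds at link time. A minimal sketch of the equivalent by hand, using the pcapng library from the log:

  cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
  ln -sf librte_pcapng.so.23.0 librte_pcapng.so.23   # SONAME link, used by the dynamic loader
  ln -sf librte_pcapng.so.23 librte_pcapng.so        # dev link, used by 'cc ... -lrte_pcapng'
  readelf -d librte_pcapng.so.23.0 | grep SONAME     # confirm the SONAME baked into the object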
Installing symlink pointing to librte_power.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so.23 00:03:14.804 Installing symlink pointing to librte_power.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so 00:03:14.804 Installing symlink pointing to librte_rawdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so.23 00:03:14.804 Installing symlink pointing to librte_rawdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so 00:03:14.804 Installing symlink pointing to librte_regexdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so.23 00:03:14.804 Installing symlink pointing to librte_regexdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so 00:03:14.804 Installing symlink pointing to librte_dmadev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so.23 00:03:14.804 Installing symlink pointing to librte_dmadev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so 00:03:14.804 Installing symlink pointing to librte_rib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so.23 00:03:14.804 Installing symlink pointing to librte_rib.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so 00:03:14.804 Installing symlink pointing to librte_reorder.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so.23 00:03:14.804 Installing symlink pointing to librte_reorder.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so 00:03:14.804 Installing symlink pointing to librte_sched.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so.23 00:03:14.804 Installing symlink pointing to librte_sched.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so 00:03:14.804 Installing symlink pointing to librte_security.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so.23 00:03:14.804 Installing symlink pointing to librte_security.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so 00:03:14.804 Installing symlink pointing to librte_stack.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so.23 00:03:14.804 './librte_bus_pci.so' -> 'dpdk/pmds-23.0/librte_bus_pci.so' 00:03:14.804 './librte_bus_pci.so.23' -> 'dpdk/pmds-23.0/librte_bus_pci.so.23' 00:03:14.804 './librte_bus_pci.so.23.0' -> 'dpdk/pmds-23.0/librte_bus_pci.so.23.0' 00:03:14.804 './librte_bus_vdev.so' -> 'dpdk/pmds-23.0/librte_bus_vdev.so' 00:03:14.805 './librte_bus_vdev.so.23' -> 'dpdk/pmds-23.0/librte_bus_vdev.so.23' 00:03:14.805 './librte_bus_vdev.so.23.0' -> 'dpdk/pmds-23.0/librte_bus_vdev.so.23.0' 00:03:14.805 './librte_mempool_ring.so' -> 'dpdk/pmds-23.0/librte_mempool_ring.so' 00:03:14.805 './librte_mempool_ring.so.23' -> 'dpdk/pmds-23.0/librte_mempool_ring.so.23' 00:03:14.805 './librte_mempool_ring.so.23.0' -> 'dpdk/pmds-23.0/librte_mempool_ring.so.23.0' 00:03:14.805 './librte_net_i40e.so' -> 'dpdk/pmds-23.0/librte_net_i40e.so' 00:03:14.805 './librte_net_i40e.so.23' -> 'dpdk/pmds-23.0/librte_net_i40e.so.23' 00:03:14.805 './librte_net_i40e.so.23.0' -> 'dpdk/pmds-23.0/librte_net_i40e.so.23.0' 00:03:14.805 Installing symlink pointing to 
librte_stack.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so 00:03:14.805 Installing symlink pointing to librte_vhost.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so.23 00:03:14.805 Installing symlink pointing to librte_vhost.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so 00:03:14.805 Installing symlink pointing to librte_ipsec.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so.23 00:03:14.805 Installing symlink pointing to librte_ipsec.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so 00:03:14.805 Installing symlink pointing to librte_fib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so.23 00:03:14.805 Installing symlink pointing to librte_fib.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so 00:03:14.805 Installing symlink pointing to librte_port.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so.23 00:03:14.805 Installing symlink pointing to librte_port.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so 00:03:14.805 Installing symlink pointing to librte_pdump.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so.23 00:03:14.805 Installing symlink pointing to librte_pdump.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so 00:03:14.805 Installing symlink pointing to librte_table.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so.23 00:03:14.805 Installing symlink pointing to librte_table.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so 00:03:14.805 Installing symlink pointing to librte_pipeline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so.23 00:03:14.805 Installing symlink pointing to librte_pipeline.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so 00:03:14.805 Installing symlink pointing to librte_graph.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so.23 00:03:14.805 Installing symlink pointing to librte_graph.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so 00:03:14.805 Installing symlink pointing to librte_node.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so.23 00:03:14.805 Installing symlink pointing to librte_node.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so 00:03:14.805 Installing symlink pointing to librte_bus_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_pci.so.23 00:03:14.805 Installing symlink pointing to librte_bus_pci.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_pci.so 00:03:14.805 Installing symlink pointing to librte_bus_vdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_vdev.so.23 00:03:14.805 Installing symlink pointing to librte_bus_vdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_vdev.so 00:03:14.805 Installing symlink pointing to librte_mempool_ring.so.23.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_mempool_ring.so.23 00:03:14.805 Installing symlink pointing to librte_mempool_ring.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_mempool_ring.so 00:03:14.805 Installing symlink pointing to librte_net_i40e.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_net_i40e.so.23 00:03:14.805 Installing symlink pointing to librte_net_i40e.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_net_i40e.so 00:03:14.805 Running custom install script '/bin/sh /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/../buildtools/symlink-drivers-solibs.sh lib dpdk/pmds-23.0' 00:03:14.805 08:21:07 -- common/autobuild_common.sh@192 -- $ uname -s 00:03:14.805 08:21:07 -- common/autobuild_common.sh@192 -- $ [[ Linux == \F\r\e\e\B\S\D ]] 00:03:14.805 08:21:07 -- common/autobuild_common.sh@203 -- $ cat 00:03:14.805 08:21:07 -- common/autobuild_common.sh@208 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:14.805 00:03:14.805 real 0m25.764s 00:03:14.805 user 6m34.769s 00:03:14.805 sys 2m12.978s 00:03:14.805 08:21:07 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:03:14.805 08:21:07 -- common/autotest_common.sh@10 -- $ set +x 00:03:14.805 ************************************ 00:03:14.805 END TEST build_native_dpdk 00:03:14.805 ************************************ 00:03:14.805 08:21:07 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:03:14.805 08:21:07 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:03:14.805 08:21:07 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]] 00:03:14.805 08:21:07 -- spdk/autobuild.sh@52 -- $ llvm_precompile 00:03:14.805 08:21:07 -- common/autobuild_common.sh@428 -- $ run_test autobuild_llvm_precompile _llvm_precompile 00:03:14.805 08:21:07 -- common/autotest_common.sh@1077 -- $ '[' 2 -le 1 ']' 00:03:14.805 08:21:07 -- common/autotest_common.sh@1083 -- $ xtrace_disable 00:03:14.805 08:21:07 -- common/autotest_common.sh@10 -- $ set +x 00:03:14.805 ************************************ 00:03:14.805 START TEST autobuild_llvm_precompile 00:03:14.805 ************************************ 00:03:14.805 08:21:07 -- common/autotest_common.sh@1104 -- $ _llvm_precompile 00:03:14.805 08:21:07 -- common/autobuild_common.sh@32 -- $ clang --version 00:03:14.805 08:21:07 -- common/autobuild_common.sh@32 -- $ [[ clang version 17.0.6 (Fedora 17.0.6-2.fc39) 00:03:14.805 Target: x86_64-redhat-linux-gnu 00:03:14.805 Thread model: posix 00:03:14.805 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]] 00:03:14.805 08:21:07 -- common/autobuild_common.sh@33 -- $ clang_num=17 00:03:14.805 08:21:07 -- common/autobuild_common.sh@35 -- $ export CC=clang-17 00:03:14.805 08:21:07 -- common/autobuild_common.sh@35 -- $ CC=clang-17 00:03:14.805 08:21:07 -- common/autobuild_common.sh@36 -- $ export CXX=clang++-17 00:03:14.805 08:21:07 -- common/autobuild_common.sh@36 -- $ CXX=clang++-17 00:03:14.805 08:21:07 -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a) 00:03:14.805 08:21:07 -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:03:14.805 08:21:07 -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a ]] 00:03:14.805 08:21:07 -- 
common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a' 00:03:14.805 08:21:07 -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:03:15.065 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 00:03:15.323 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:03:15.323 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:15.323 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:03:15.891 Using 'verbs' RDMA provider 00:03:31.353 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done. 00:03:43.567 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done. 00:03:43.567 Creating mk/config.mk...done. 00:03:43.567 Creating mk/cc.flags.mk...done. 00:03:43.567 Type 'make' to build. 00:03:43.567 00:03:43.567 real 0m28.573s 00:03:43.567 user 0m12.618s 00:03:43.567 sys 0m15.334s 00:03:43.567 08:21:35 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:03:43.567 08:21:35 -- common/autotest_common.sh@10 -- $ set +x 00:03:43.567 ************************************ 00:03:43.567 END TEST autobuild_llvm_precompile 00:03:43.567 ************************************ 00:03:43.567 08:21:36 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:03:43.567 08:21:36 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:03:43.567 08:21:36 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:03:43.567 08:21:36 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]] 00:03:43.567 08:21:36 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:03:43.826 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 00:03:43.826 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:03:43.826 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:44.085 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:03:44.344 Using 'verbs' RDMA provider 00:03:57.127 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done. 00:04:07.201 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done. 00:04:07.460 Creating mk/config.mk...done. 
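Both configure passes report "Using .../dpdk/build/lib/pkgconfig for additional libs", i.e. SPDK resolves the freshly built DPDK through the libdpdk.pc installed earlier rather than through a system copy. A minimal sketch of reproducing that lookup by hand (path taken from this log; the pkg-config flags are standard):

  export PKG_CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig
  pkg-config --modversion libdpdk       # should report the 22.11-series version built above
  pkg-config --cflags --libs libdpdk    # the include and linker flags configure consumes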
00:04:07.460 Creating mk/cc.flags.mk...done. 00:04:07.460 Type 'make' to build. 00:04:07.460 08:22:00 -- spdk/autobuild.sh@69 -- $ run_test make make -j112 00:04:07.460 08:22:00 -- common/autotest_common.sh@1077 -- $ '[' 3 -le 1 ']' 00:04:07.460 08:22:00 -- common/autotest_common.sh@1083 -- $ xtrace_disable 00:04:07.460 08:22:00 -- common/autotest_common.sh@10 -- $ set +x 00:04:07.460 ************************************ 00:04:07.460 START TEST make 00:04:07.460 ************************************ 00:04:07.460 08:22:00 -- common/autotest_common.sh@1104 -- $ make -j112 00:04:07.719 make[1]: Nothing to be done for 'all'. 00:04:09.627 The Meson build system 00:04:09.627 Version: 1.5.0 00:04:09.627 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user 00:04:09.627 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:04:09.627 Build type: native build 00:04:09.627 Project name: libvfio-user 00:04:09.627 Project version: 0.0.1 00:04:09.627 C compiler for the host machine: clang-17 (clang 17.0.6 "clang version 17.0.6 (Fedora 17.0.6-2.fc39)") 00:04:09.627 C linker for the host machine: clang-17 ld.bfd 2.40-14 00:04:09.627 Host machine cpu family: x86_64 00:04:09.627 Host machine cpu: x86_64 00:04:09.627 Run-time dependency threads found: YES 00:04:09.627 Library dl found: YES 00:04:09.627 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:04:09.627 Run-time dependency json-c found: YES 0.17 00:04:09.627 Run-time dependency cmocka found: YES 1.1.7 00:04:09.627 Program pytest-3 found: NO 00:04:09.627 Program flake8 found: NO 00:04:09.627 Program misspell-fixer found: NO 00:04:09.627 Program restructuredtext-lint found: NO 00:04:09.627 Program valgrind found: YES (/usr/bin/valgrind) 00:04:09.627 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:04:09.627 Compiler for C supports arguments -Wmissing-declarations: YES 00:04:09.627 Compiler for C supports arguments -Wwrite-strings: YES 00:04:09.627 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 00:04:09.627 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh) 00:04:09.627 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh) 00:04:09.627 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 
00:04:09.627 Build targets in project: 8 00:04:09.627 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions: 00:04:09.627 * 0.57.0: {'exclude_suites arg in add_test_setup'} 00:04:09.627 00:04:09.627 libvfio-user 0.0.1 00:04:09.627 00:04:09.627 User defined options 00:04:09.627 buildtype : debug 00:04:09.627 default_library: static 00:04:09.627 libdir : /usr/local/lib 00:04:09.627 00:04:09.627 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:04:09.627 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:04:09.886 [1/36] Compiling C object samples/lspci.p/lspci.c.o 00:04:09.886 [2/36] Compiling C object lib/libvfio-user.a.p/pci.c.o 00:04:09.886 [3/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o 00:04:09.886 [4/36] Compiling C object samples/null.p/null.c.o 00:04:09.886 [5/36] Compiling C object lib/libvfio-user.a.p/irq.c.o 00:04:09.886 [6/36] Compiling C object lib/libvfio-user.a.p/tran.c.o 00:04:09.886 [7/36] Compiling C object samples/client.p/.._lib_tran.c.o 00:04:09.886 [8/36] Compiling C object samples/client.p/.._lib_migration.c.o 00:04:09.886 [9/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o 00:04:09.886 [10/36] Compiling C object lib/libvfio-user.a.p/migration.c.o 00:04:09.886 [11/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o 00:04:09.886 [12/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o 00:04:09.886 [13/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o 00:04:09.886 [14/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o 00:04:09.886 [15/36] Compiling C object test/unit_tests.p/mocks.c.o 00:04:09.886 [16/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o 00:04:09.886 [17/36] Compiling C object samples/server.p/server.c.o 00:04:09.886 [18/36] Compiling C object lib/libvfio-user.a.p/dma.c.o 00:04:09.886 [19/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o 00:04:09.886 [20/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o 00:04:09.886 [21/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o 00:04:09.886 [22/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o 00:04:09.886 [23/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o 00:04:09.886 [24/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o 00:04:09.886 [25/36] Compiling C object test/unit_tests.p/unit-tests.c.o 00:04:09.886 [26/36] Compiling C object samples/client.p/client.c.o 00:04:09.886 [27/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o 00:04:09.886 [28/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o 00:04:09.886 [29/36] Linking static target lib/libvfio-user.a 00:04:09.886 [30/36] Linking target samples/client 00:04:09.886 [31/36] Linking target test/unit_tests 00:04:09.886 [32/36] Linking target samples/gpio-pci-idio-16 00:04:09.886 [33/36] Linking target samples/server 00:04:09.886 [34/36] Linking target samples/lspci 00:04:09.886 [35/36] Linking target samples/null 00:04:09.886 [36/36] Linking target samples/shadow_ioeventfd_server 00:04:09.886 INFO: autodetecting backend as ninja 00:04:09.886 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:04:10.146 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:04:10.405 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:04:10.405 ninja: no work to do. 00:04:13.694 CC lib/ut/ut.o 00:04:13.694 CC lib/log/log.o 00:04:13.694 CC lib/log/log_flags.o 00:04:13.694 CC lib/log/log_deprecated.o 00:04:13.694 CC lib/ut_mock/mock.o 00:04:13.694 LIB libspdk_ut.a 00:04:13.694 LIB libspdk_ut_mock.a 00:04:13.694 LIB libspdk_log.a 00:04:13.954 CC lib/dma/dma.o 00:04:13.954 CXX lib/trace_parser/trace.o 00:04:13.954 CC lib/util/cpuset.o 00:04:13.954 CC lib/util/base64.o 00:04:13.954 CC lib/ioat/ioat.o 00:04:13.954 CC lib/util/bit_array.o 00:04:13.954 CC lib/util/crc32.o 00:04:13.954 CC lib/util/crc16.o 00:04:13.954 CC lib/util/crc32c.o 00:04:13.954 CC lib/util/dif.o 00:04:13.954 CC lib/util/crc32_ieee.o 00:04:13.954 CC lib/util/crc64.o 00:04:13.954 CC lib/util/fd.o 00:04:13.954 CC lib/util/file.o 00:04:13.954 CC lib/util/hexlify.o 00:04:13.954 CC lib/util/iov.o 00:04:13.954 CC lib/util/math.o 00:04:13.954 CC lib/util/pipe.o 00:04:13.954 CC lib/util/strerror_tls.o 00:04:13.954 CC lib/util/uuid.o 00:04:13.954 CC lib/util/string.o 00:04:13.954 CC lib/util/fd_group.o 00:04:13.954 CC lib/util/xor.o 00:04:13.954 CC lib/util/zipf.o 00:04:13.954 LIB libspdk_dma.a 00:04:13.954 CC lib/vfio_user/host/vfio_user_pci.o 00:04:13.954 CC lib/vfio_user/host/vfio_user.o 00:04:14.213 LIB libspdk_ioat.a 00:04:14.213 LIB libspdk_vfio_user.a 00:04:14.213 LIB libspdk_util.a 00:04:14.472 LIB libspdk_trace_parser.a 00:04:14.472 CC lib/conf/conf.o 00:04:14.472 CC lib/env_dpdk/init.o 00:04:14.472 CC lib/env_dpdk/env.o 00:04:14.472 CC lib/env_dpdk/memory.o 00:04:14.472 CC lib/env_dpdk/pci.o 00:04:14.472 CC lib/env_dpdk/threads.o 00:04:14.472 CC lib/env_dpdk/sigbus_handler.o 00:04:14.472 CC lib/env_dpdk/pci_ioat.o 00:04:14.472 CC lib/env_dpdk/pci_event.o 00:04:14.472 CC lib/env_dpdk/pci_virtio.o 00:04:14.472 CC lib/env_dpdk/pci_vmd.o 00:04:14.472 CC lib/env_dpdk/pci_idxd.o 00:04:14.472 CC lib/env_dpdk/pci_dpdk_2211.o 00:04:14.472 CC lib/env_dpdk/pci_dpdk.o 00:04:14.472 CC lib/env_dpdk/pci_dpdk_2207.o 00:04:14.472 CC lib/vmd/vmd.o 00:04:14.472 CC lib/vmd/led.o 00:04:14.472 CC lib/rdma/common.o 00:04:14.472 CC lib/rdma/rdma_verbs.o 00:04:14.472 CC lib/json/json_parse.o 00:04:14.472 CC lib/json/json_util.o 00:04:14.472 CC lib/idxd/idxd.o 00:04:14.472 CC lib/json/json_write.o 00:04:14.472 CC lib/idxd/idxd_user.o 00:04:14.472 CC lib/idxd/idxd_kernel.o 00:04:14.731 LIB libspdk_conf.a 00:04:14.731 LIB libspdk_rdma.a 00:04:14.731 LIB libspdk_json.a 00:04:14.731 LIB libspdk_idxd.a 00:04:14.989 LIB libspdk_vmd.a 00:04:14.989 CC lib/jsonrpc/jsonrpc_server.o 00:04:14.989 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:04:14.989 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:04:14.989 CC lib/jsonrpc/jsonrpc_client.o 00:04:15.248 LIB libspdk_jsonrpc.a 00:04:15.516 LIB libspdk_env_dpdk.a 00:04:15.516 CC lib/rpc/rpc.o 00:04:15.516 LIB libspdk_rpc.a 00:04:15.775 CC lib/notify/notify.o 00:04:16.034 CC lib/notify/notify_rpc.o 00:04:16.034 CC lib/trace/trace.o 00:04:16.034 CC lib/trace/trace_flags.o 00:04:16.034 CC lib/trace/trace_rpc.o 00:04:16.034 CC lib/sock/sock.o 00:04:16.034 CC lib/sock/sock_rpc.o 00:04:16.034 LIB libspdk_notify.a 00:04:16.034 LIB libspdk_trace.a 00:04:16.292 LIB libspdk_sock.a 00:04:16.292 CC lib/thread/thread.o 00:04:16.292 CC lib/thread/iobuf.o 00:04:16.551 CC lib/nvme/nvme_ctrlr_cmd.o 00:04:16.551 CC lib/nvme/nvme_ctrlr.o 00:04:16.552 CC 
lib/nvme/nvme_fabric.o 00:04:16.552 CC lib/nvme/nvme_ns_cmd.o 00:04:16.552 CC lib/nvme/nvme_ns.o 00:04:16.552 CC lib/nvme/nvme_pcie_common.o 00:04:16.552 CC lib/nvme/nvme_pcie.o 00:04:16.552 CC lib/nvme/nvme_qpair.o 00:04:16.552 CC lib/nvme/nvme.o 00:04:16.552 CC lib/nvme/nvme_quirks.o 00:04:16.552 CC lib/nvme/nvme_transport.o 00:04:16.552 CC lib/nvme/nvme_discovery.o 00:04:16.552 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:04:16.552 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:04:16.552 CC lib/nvme/nvme_tcp.o 00:04:16.552 CC lib/nvme/nvme_opal.o 00:04:16.552 CC lib/nvme/nvme_zns.o 00:04:16.552 CC lib/nvme/nvme_io_msg.o 00:04:16.552 CC lib/nvme/nvme_poll_group.o 00:04:16.552 CC lib/nvme/nvme_cuse.o 00:04:16.552 CC lib/nvme/nvme_vfio_user.o 00:04:16.552 CC lib/nvme/nvme_rdma.o 00:04:17.120 LIB libspdk_thread.a 00:04:17.379 CC lib/blob/blobstore.o 00:04:17.379 CC lib/blob/blob_bs_dev.o 00:04:17.379 CC lib/blob/request.o 00:04:17.379 CC lib/blob/zeroes.o 00:04:17.379 CC lib/vfu_tgt/tgt_endpoint.o 00:04:17.379 CC lib/vfu_tgt/tgt_rpc.o 00:04:17.379 CC lib/virtio/virtio.o 00:04:17.379 CC lib/init/json_config.o 00:04:17.379 CC lib/virtio/virtio_vhost_user.o 00:04:17.379 CC lib/init/subsystem.o 00:04:17.379 CC lib/virtio/virtio_vfio_user.o 00:04:17.379 CC lib/init/subsystem_rpc.o 00:04:17.379 CC lib/virtio/virtio_pci.o 00:04:17.379 CC lib/init/rpc.o 00:04:17.379 CC lib/accel/accel_sw.o 00:04:17.379 CC lib/accel/accel.o 00:04:17.379 CC lib/accel/accel_rpc.o 00:04:17.637 LIB libspdk_init.a 00:04:17.637 LIB libspdk_vfu_tgt.a 00:04:17.637 LIB libspdk_virtio.a 00:04:17.637 LIB libspdk_nvme.a 00:04:17.897 CC lib/event/app.o 00:04:17.897 CC lib/event/log_rpc.o 00:04:17.897 CC lib/event/reactor.o 00:04:17.897 CC lib/event/app_rpc.o 00:04:17.897 CC lib/event/scheduler_static.o 00:04:18.157 LIB libspdk_accel.a 00:04:18.157 LIB libspdk_event.a 00:04:18.416 CC lib/bdev/bdev.o 00:04:18.416 CC lib/bdev/bdev_rpc.o 00:04:18.416 CC lib/bdev/bdev_zone.o 00:04:18.416 CC lib/bdev/scsi_nvme.o 00:04:18.416 CC lib/bdev/part.o 00:04:18.985 LIB libspdk_blob.a 00:04:19.244 CC lib/lvol/lvol.o 00:04:19.244 CC lib/blobfs/blobfs.o 00:04:19.244 CC lib/blobfs/tree.o 00:04:19.811 LIB libspdk_lvol.a 00:04:19.811 LIB libspdk_blobfs.a 00:04:20.070 LIB libspdk_bdev.a 00:04:20.328 CC lib/nbd/nbd.o 00:04:20.328 CC lib/nbd/nbd_rpc.o 00:04:20.328 CC lib/ublk/ublk.o 00:04:20.328 CC lib/ublk/ublk_rpc.o 00:04:20.328 CC lib/ftl/ftl_core.o 00:04:20.328 CC lib/ftl/ftl_init.o 00:04:20.328 CC lib/ftl/ftl_layout.o 00:04:20.328 CC lib/ftl/ftl_sb.o 00:04:20.328 CC lib/ftl/ftl_debug.o 00:04:20.328 CC lib/ftl/ftl_io.o 00:04:20.328 CC lib/ftl/ftl_l2p.o 00:04:20.328 CC lib/ftl/ftl_l2p_flat.o 00:04:20.328 CC lib/nvmf/ctrlr.o 00:04:20.328 CC lib/ftl/ftl_nv_cache.o 00:04:20.328 CC lib/ftl/ftl_writer.o 00:04:20.328 CC lib/nvmf/ctrlr_discovery.o 00:04:20.328 CC lib/ftl/ftl_band.o 00:04:20.328 CC lib/nvmf/subsystem.o 00:04:20.328 CC lib/nvmf/ctrlr_bdev.o 00:04:20.328 CC lib/ftl/ftl_band_ops.o 00:04:20.328 CC lib/ftl/ftl_l2p_cache.o 00:04:20.328 CC lib/ftl/ftl_rq.o 00:04:20.328 CC lib/ftl/ftl_reloc.o 00:04:20.328 CC lib/nvmf/nvmf.o 00:04:20.328 CC lib/scsi/dev.o 00:04:20.328 CC lib/nvmf/nvmf_rpc.o 00:04:20.328 CC lib/scsi/lun.o 00:04:20.328 CC lib/nvmf/transport.o 00:04:20.328 CC lib/scsi/port.o 00:04:20.328 CC lib/scsi/scsi.o 00:04:20.328 CC lib/ftl/ftl_p2l.o 00:04:20.328 CC lib/nvmf/tcp.o 00:04:20.328 CC lib/ftl/mngt/ftl_mngt.o 00:04:20.328 CC lib/scsi/scsi_bdev.o 00:04:20.328 CC lib/nvmf/vfio_user.o 00:04:20.328 CC lib/scsi/scsi_pr.o 00:04:20.328 CC 
lib/ftl/mngt/ftl_mngt_bdev.o 00:04:20.328 CC lib/nvmf/rdma.o 00:04:20.328 CC lib/scsi/scsi_rpc.o 00:04:20.328 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:04:20.328 CC lib/ftl/mngt/ftl_mngt_startup.o 00:04:20.328 CC lib/scsi/task.o 00:04:20.328 CC lib/ftl/mngt/ftl_mngt_md.o 00:04:20.328 CC lib/ftl/mngt/ftl_mngt_misc.o 00:04:20.328 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:04:20.328 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:04:20.328 CC lib/ftl/mngt/ftl_mngt_band.o 00:04:20.328 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:04:20.328 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:04:20.328 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:04:20.328 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:04:20.328 CC lib/ftl/utils/ftl_conf.o 00:04:20.328 CC lib/ftl/utils/ftl_md.o 00:04:20.328 CC lib/ftl/utils/ftl_mempool.o 00:04:20.328 CC lib/ftl/utils/ftl_bitmap.o 00:04:20.328 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:04:20.328 CC lib/ftl/utils/ftl_property.o 00:04:20.328 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:04:20.328 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:04:20.328 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:04:20.328 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:04:20.328 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:04:20.328 CC lib/ftl/upgrade/ftl_sb_v3.o 00:04:20.328 CC lib/ftl/upgrade/ftl_sb_v5.o 00:04:20.328 CC lib/ftl/nvc/ftl_nvc_dev.o 00:04:20.328 CC lib/ftl/base/ftl_base_dev.o 00:04:20.328 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:04:20.328 CC lib/ftl/base/ftl_base_bdev.o 00:04:20.328 CC lib/ftl/ftl_trace.o 00:04:20.586 LIB libspdk_nbd.a 00:04:20.586 LIB libspdk_scsi.a 00:04:20.846 LIB libspdk_ublk.a 00:04:20.846 LIB libspdk_ftl.a 00:04:21.106 CC lib/vhost/vhost_rpc.o 00:04:21.106 CC lib/vhost/vhost.o 00:04:21.106 CC lib/vhost/rte_vhost_user.o 00:04:21.106 CC lib/vhost/vhost_scsi.o 00:04:21.106 CC lib/vhost/vhost_blk.o 00:04:21.106 CC lib/iscsi/conn.o 00:04:21.106 CC lib/iscsi/init_grp.o 00:04:21.106 CC lib/iscsi/iscsi.o 00:04:21.106 CC lib/iscsi/md5.o 00:04:21.106 CC lib/iscsi/param.o 00:04:21.106 CC lib/iscsi/tgt_node.o 00:04:21.106 CC lib/iscsi/portal_grp.o 00:04:21.106 CC lib/iscsi/iscsi_subsystem.o 00:04:21.106 CC lib/iscsi/iscsi_rpc.o 00:04:21.106 CC lib/iscsi/task.o 00:04:21.365 LIB libspdk_nvmf.a 00:04:21.625 LIB libspdk_vhost.a 00:04:21.625 LIB libspdk_iscsi.a 00:04:22.193 CC module/env_dpdk/env_dpdk_rpc.o 00:04:22.193 CC module/vfu_device/vfu_virtio.o 00:04:22.193 CC module/vfu_device/vfu_virtio_blk.o 00:04:22.193 CC module/vfu_device/vfu_virtio_scsi.o 00:04:22.193 CC module/vfu_device/vfu_virtio_rpc.o 00:04:22.193 CC module/sock/posix/posix.o 00:04:22.193 LIB libspdk_env_dpdk_rpc.a 00:04:22.193 CC module/accel/iaa/accel_iaa.o 00:04:22.193 CC module/accel/iaa/accel_iaa_rpc.o 00:04:22.193 CC module/accel/error/accel_error.o 00:04:22.193 CC module/accel/error/accel_error_rpc.o 00:04:22.452 CC module/accel/dsa/accel_dsa_rpc.o 00:04:22.452 CC module/accel/dsa/accel_dsa.o 00:04:22.452 CC module/accel/ioat/accel_ioat.o 00:04:22.452 CC module/accel/ioat/accel_ioat_rpc.o 00:04:22.452 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:04:22.452 CC module/scheduler/dynamic/scheduler_dynamic.o 00:04:22.452 CC module/blob/bdev/blob_bdev.o 00:04:22.452 CC module/scheduler/gscheduler/gscheduler.o 00:04:22.452 LIB libspdk_accel_iaa.a 00:04:22.452 LIB libspdk_scheduler_dpdk_governor.a 00:04:22.452 LIB libspdk_accel_error.a 00:04:22.452 LIB libspdk_accel_ioat.a 00:04:22.452 LIB libspdk_scheduler_dynamic.a 00:04:22.452 LIB libspdk_scheduler_gscheduler.a 00:04:22.452 LIB libspdk_accel_dsa.a 00:04:22.452 LIB libspdk_blob_bdev.a 00:04:22.711 LIB libspdk_vfu_device.a 
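Each CC line above compiles one translation unit and each LIB line archives the resulting objects into a static library. A minimal hand-rolled sketch of those two steps (paths, flags, and file names here are illustrative assumptions, not the build's actual invocation):

  # hypothetical sketch: compile objects, then archive them the way a LIB step does
  cc -c -Iinclude lib/ut_mock/mock.c -o mock.o
  ar crs libspdk_ut_mock.a mock.o    # create/replace the archive and write its index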
00:04:22.711 LIB libspdk_sock_posix.a 00:04:22.971 CC module/bdev/delay/vbdev_delay.o 00:04:22.971 CC module/bdev/delay/vbdev_delay_rpc.o 00:04:22.971 CC module/bdev/error/vbdev_error.o 00:04:22.971 CC module/bdev/error/vbdev_error_rpc.o 00:04:22.971 CC module/bdev/raid/bdev_raid_rpc.o 00:04:22.971 CC module/bdev/raid/bdev_raid.o 00:04:22.971 CC module/bdev/raid/raid0.o 00:04:22.971 CC module/bdev/null/bdev_null.o 00:04:22.971 CC module/bdev/raid/bdev_raid_sb.o 00:04:22.971 CC module/bdev/raid/concat.o 00:04:22.971 CC module/bdev/split/vbdev_split.o 00:04:22.971 CC module/bdev/null/bdev_null_rpc.o 00:04:22.971 CC module/blobfs/bdev/blobfs_bdev.o 00:04:22.971 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:04:22.971 CC module/bdev/raid/raid1.o 00:04:22.971 CC module/bdev/nvme/bdev_nvme_rpc.o 00:04:22.971 CC module/bdev/nvme/bdev_nvme.o 00:04:22.971 CC module/bdev/split/vbdev_split_rpc.o 00:04:22.971 CC module/bdev/lvol/vbdev_lvol.o 00:04:22.971 CC module/bdev/virtio/bdev_virtio_scsi.o 00:04:22.971 CC module/bdev/passthru/vbdev_passthru.o 00:04:22.971 CC module/bdev/virtio/bdev_virtio_blk.o 00:04:22.971 CC module/bdev/ftl/bdev_ftl.o 00:04:22.971 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:04:22.971 CC module/bdev/nvme/nvme_rpc.o 00:04:22.971 CC module/bdev/virtio/bdev_virtio_rpc.o 00:04:22.971 CC module/bdev/ftl/bdev_ftl_rpc.o 00:04:22.971 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:04:22.971 CC module/bdev/nvme/bdev_mdns_client.o 00:04:22.971 CC module/bdev/nvme/vbdev_opal.o 00:04:22.971 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:04:22.971 CC module/bdev/iscsi/bdev_iscsi.o 00:04:22.971 CC module/bdev/nvme/vbdev_opal_rpc.o 00:04:22.971 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:04:22.971 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:04:22.971 CC module/bdev/gpt/vbdev_gpt.o 00:04:22.971 CC module/bdev/gpt/gpt.o 00:04:22.971 CC module/bdev/zone_block/vbdev_zone_block.o 00:04:22.971 CC module/bdev/malloc/bdev_malloc_rpc.o 00:04:22.971 CC module/bdev/malloc/bdev_malloc.o 00:04:22.971 CC module/bdev/aio/bdev_aio.o 00:04:22.971 CC module/bdev/aio/bdev_aio_rpc.o 00:04:22.971 LIB libspdk_blobfs_bdev.a 00:04:23.231 LIB libspdk_bdev_error.a 00:04:23.231 LIB libspdk_bdev_split.a 00:04:23.231 LIB libspdk_bdev_null.a 00:04:23.231 LIB libspdk_bdev_gpt.a 00:04:23.231 LIB libspdk_bdev_ftl.a 00:04:23.231 LIB libspdk_bdev_passthru.a 00:04:23.231 LIB libspdk_bdev_delay.a 00:04:23.231 LIB libspdk_bdev_zone_block.a 00:04:23.231 LIB libspdk_bdev_iscsi.a 00:04:23.231 LIB libspdk_bdev_aio.a 00:04:23.231 LIB libspdk_bdev_malloc.a 00:04:23.231 LIB libspdk_bdev_lvol.a 00:04:23.231 LIB libspdk_bdev_virtio.a 00:04:23.490 LIB libspdk_bdev_raid.a 00:04:24.060 LIB libspdk_bdev_nvme.a 00:04:24.629 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:04:24.629 CC module/event/subsystems/sock/sock.o 00:04:24.629 CC module/event/subsystems/vmd/vmd.o 00:04:24.629 CC module/event/subsystems/vmd/vmd_rpc.o 00:04:24.629 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:04:24.629 CC module/event/subsystems/iobuf/iobuf.o 00:04:24.629 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:04:24.629 CC module/event/subsystems/scheduler/scheduler.o 00:04:24.629 LIB libspdk_event_vhost_blk.a 00:04:24.889 LIB libspdk_event_sock.a 00:04:24.889 LIB libspdk_event_vmd.a 00:04:24.889 LIB libspdk_event_vfu_tgt.a 00:04:24.889 LIB libspdk_event_scheduler.a 00:04:24.889 LIB libspdk_event_iobuf.a 00:04:25.149 CC module/event/subsystems/accel/accel.o 00:04:25.149 LIB libspdk_event_accel.a 00:04:25.408 CC module/event/subsystems/bdev/bdev.o 00:04:25.667 LIB 
libspdk_event_bdev.a 00:04:25.926 CC module/event/subsystems/ublk/ublk.o 00:04:25.926 CC module/event/subsystems/scsi/scsi.o 00:04:25.926 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:04:25.926 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:04:25.926 CC module/event/subsystems/nbd/nbd.o 00:04:25.926 LIB libspdk_event_ublk.a 00:04:25.926 LIB libspdk_event_scsi.a 00:04:25.926 LIB libspdk_event_nbd.a 00:04:26.185 LIB libspdk_event_nvmf.a 00:04:26.445 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:04:26.445 CC module/event/subsystems/iscsi/iscsi.o 00:04:26.445 LIB libspdk_event_vhost_scsi.a 00:04:26.445 LIB libspdk_event_iscsi.a 00:04:26.706 CC app/spdk_nvme_discover/discovery_aer.o 00:04:26.706 CC app/trace_record/trace_record.o 00:04:26.706 CC app/spdk_nvme_identify/identify.o 00:04:26.706 CC app/spdk_nvme_perf/perf.o 00:04:26.706 CXX app/trace/trace.o 00:04:26.706 CC app/spdk_top/spdk_top.o 00:04:26.706 CC app/spdk_lspci/spdk_lspci.o 00:04:26.706 CC app/spdk_dd/spdk_dd.o 00:04:26.706 TEST_HEADER include/spdk/accel.h 00:04:26.706 CC test/rpc_client/rpc_client_test.o 00:04:26.706 TEST_HEADER include/spdk/barrier.h 00:04:26.706 TEST_HEADER include/spdk/accel_module.h 00:04:26.706 TEST_HEADER include/spdk/assert.h 00:04:26.706 TEST_HEADER include/spdk/bdev_module.h 00:04:26.706 TEST_HEADER include/spdk/bdev.h 00:04:26.706 TEST_HEADER include/spdk/base64.h 00:04:26.706 TEST_HEADER include/spdk/bdev_zone.h 00:04:26.706 TEST_HEADER include/spdk/bit_pool.h 00:04:26.706 TEST_HEADER include/spdk/bit_array.h 00:04:26.706 TEST_HEADER include/spdk/blobfs_bdev.h 00:04:26.706 TEST_HEADER include/spdk/blob_bdev.h 00:04:26.706 CC app/nvmf_tgt/nvmf_main.o 00:04:26.706 TEST_HEADER include/spdk/blobfs.h 00:04:26.706 TEST_HEADER include/spdk/blob.h 00:04:26.706 TEST_HEADER include/spdk/conf.h 00:04:26.706 TEST_HEADER include/spdk/cpuset.h 00:04:26.706 TEST_HEADER include/spdk/config.h 00:04:26.706 TEST_HEADER include/spdk/crc16.h 00:04:26.706 TEST_HEADER include/spdk/crc32.h 00:04:26.706 TEST_HEADER include/spdk/crc64.h 00:04:26.706 TEST_HEADER include/spdk/dif.h 00:04:26.706 TEST_HEADER include/spdk/dma.h 00:04:26.706 TEST_HEADER include/spdk/endian.h 00:04:26.706 TEST_HEADER include/spdk/env_dpdk.h 00:04:26.706 TEST_HEADER include/spdk/env.h 00:04:26.706 TEST_HEADER include/spdk/event.h 00:04:26.706 TEST_HEADER include/spdk/fd_group.h 00:04:26.706 TEST_HEADER include/spdk/file.h 00:04:26.706 TEST_HEADER include/spdk/fd.h 00:04:26.706 TEST_HEADER include/spdk/ftl.h 00:04:26.706 TEST_HEADER include/spdk/gpt_spec.h 00:04:26.706 TEST_HEADER include/spdk/histogram_data.h 00:04:26.706 TEST_HEADER include/spdk/hexlify.h 00:04:26.706 TEST_HEADER include/spdk/idxd.h 00:04:26.706 TEST_HEADER include/spdk/idxd_spec.h 00:04:26.706 TEST_HEADER include/spdk/init.h 00:04:26.706 TEST_HEADER include/spdk/ioat_spec.h 00:04:26.706 TEST_HEADER include/spdk/ioat.h 00:04:26.706 TEST_HEADER include/spdk/iscsi_spec.h 00:04:26.706 TEST_HEADER include/spdk/json.h 00:04:26.706 TEST_HEADER include/spdk/jsonrpc.h 00:04:26.706 TEST_HEADER include/spdk/likely.h 00:04:26.706 TEST_HEADER include/spdk/log.h 00:04:26.706 TEST_HEADER include/spdk/lvol.h 00:04:26.706 CC app/spdk_tgt/spdk_tgt.o 00:04:26.706 TEST_HEADER include/spdk/mmio.h 00:04:26.706 TEST_HEADER include/spdk/memory.h 00:04:26.706 TEST_HEADER include/spdk/nbd.h 00:04:26.706 TEST_HEADER include/spdk/notify.h 00:04:26.706 TEST_HEADER include/spdk/nvme.h 00:04:26.706 TEST_HEADER include/spdk/nvme_intel.h 00:04:26.706 CC app/vhost/vhost.o 00:04:26.706 TEST_HEADER 
include/spdk/nvme_ocssd.h 00:04:26.969 CC app/iscsi_tgt/iscsi_tgt.o 00:04:26.969 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:04:26.969 TEST_HEADER include/spdk/nvme_spec.h 00:04:26.969 TEST_HEADER include/spdk/nvme_zns.h 00:04:26.969 TEST_HEADER include/spdk/nvmf_cmd.h 00:04:26.969 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:04:26.969 TEST_HEADER include/spdk/nvmf.h 00:04:26.969 TEST_HEADER include/spdk/nvmf_spec.h 00:04:26.969 TEST_HEADER include/spdk/nvmf_transport.h 00:04:26.969 TEST_HEADER include/spdk/opal.h 00:04:26.969 TEST_HEADER include/spdk/opal_spec.h 00:04:26.969 TEST_HEADER include/spdk/pci_ids.h 00:04:26.969 TEST_HEADER include/spdk/pipe.h 00:04:26.970 TEST_HEADER include/spdk/queue.h 00:04:26.970 TEST_HEADER include/spdk/reduce.h 00:04:26.970 CC examples/interrupt_tgt/interrupt_tgt.o 00:04:26.970 TEST_HEADER include/spdk/rpc.h 00:04:26.970 TEST_HEADER include/spdk/scsi.h 00:04:26.970 TEST_HEADER include/spdk/scheduler.h 00:04:26.970 TEST_HEADER include/spdk/scsi_spec.h 00:04:26.970 TEST_HEADER include/spdk/stdinc.h 00:04:26.970 TEST_HEADER include/spdk/string.h 00:04:26.970 TEST_HEADER include/spdk/sock.h 00:04:26.970 TEST_HEADER include/spdk/thread.h 00:04:26.970 TEST_HEADER include/spdk/trace.h 00:04:26.970 TEST_HEADER include/spdk/tree.h 00:04:26.970 TEST_HEADER include/spdk/trace_parser.h 00:04:26.970 TEST_HEADER include/spdk/util.h 00:04:26.970 TEST_HEADER include/spdk/ublk.h 00:04:26.970 TEST_HEADER include/spdk/uuid.h 00:04:26.970 TEST_HEADER include/spdk/version.h 00:04:26.970 TEST_HEADER include/spdk/vfio_user_pci.h 00:04:26.970 TEST_HEADER include/spdk/vfio_user_spec.h 00:04:26.970 TEST_HEADER include/spdk/vhost.h 00:04:26.970 TEST_HEADER include/spdk/vmd.h 00:04:26.970 TEST_HEADER include/spdk/xor.h 00:04:26.970 TEST_HEADER include/spdk/zipf.h 00:04:26.970 CXX test/cpp_headers/accel_module.o 00:04:26.970 CXX test/cpp_headers/accel.o 00:04:26.970 CXX test/cpp_headers/assert.o 00:04:26.970 CXX test/cpp_headers/barrier.o 00:04:26.970 CXX test/cpp_headers/base64.o 00:04:26.970 CXX test/cpp_headers/bdev_module.o 00:04:26.970 CXX test/cpp_headers/bdev.o 00:04:26.970 CXX test/cpp_headers/bdev_zone.o 00:04:26.970 CXX test/cpp_headers/bit_array.o 00:04:26.970 CXX test/cpp_headers/bit_pool.o 00:04:26.970 CXX test/cpp_headers/blob_bdev.o 00:04:26.970 CXX test/cpp_headers/blobfs_bdev.o 00:04:26.970 CXX test/cpp_headers/blobfs.o 00:04:26.970 CXX test/cpp_headers/blob.o 00:04:26.970 CXX test/cpp_headers/config.o 00:04:26.970 CXX test/cpp_headers/conf.o 00:04:26.970 CXX test/cpp_headers/cpuset.o 00:04:26.970 CXX test/cpp_headers/crc32.o 00:04:26.970 CXX test/cpp_headers/crc16.o 00:04:26.970 CXX test/cpp_headers/crc64.o 00:04:26.970 CXX test/cpp_headers/dif.o 00:04:26.970 CXX test/cpp_headers/dma.o 00:04:26.970 CXX test/cpp_headers/endian.o 00:04:26.970 CC test/thread/lock/spdk_lock.o 00:04:26.970 CXX test/cpp_headers/env_dpdk.o 00:04:26.970 CXX test/cpp_headers/env.o 00:04:26.970 CXX test/cpp_headers/event.o 00:04:26.970 CC app/fio/nvme/fio_plugin.o 00:04:26.970 CXX test/cpp_headers/fd_group.o 00:04:26.970 CXX test/cpp_headers/fd.o 00:04:26.970 CXX test/cpp_headers/file.o 00:04:26.970 CXX test/cpp_headers/ftl.o 00:04:26.970 CC test/thread/poller_perf/poller_perf.o 00:04:26.970 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:04:26.970 CXX test/cpp_headers/gpt_spec.o 00:04:26.970 CXX test/cpp_headers/hexlify.o 00:04:26.970 CXX test/cpp_headers/histogram_data.o 00:04:26.970 CC test/event/reactor_perf/reactor_perf.o 00:04:26.970 CXX test/cpp_headers/idxd.o 00:04:26.970 CXX 
test/cpp_headers/idxd_spec.o 00:04:26.970 CXX test/cpp_headers/init.o 00:04:26.970 CC examples/nvme/reconnect/reconnect.o 00:04:26.970 CC test/nvme/sgl/sgl.o 00:04:26.970 CC examples/nvme/cmb_copy/cmb_copy.o 00:04:26.970 CC examples/nvme/hello_world/hello_world.o 00:04:26.970 CC examples/nvme/nvme_manage/nvme_manage.o 00:04:26.970 CC test/nvme/overhead/overhead.o 00:04:26.970 CC test/nvme/aer/aer.o 00:04:26.970 CC test/event/event_perf/event_perf.o 00:04:26.970 CC examples/nvme/arbitration/arbitration.o 00:04:26.970 CC test/nvme/connect_stress/connect_stress.o 00:04:26.970 CC test/nvme/err_injection/err_injection.o 00:04:26.970 CC examples/accel/perf/accel_perf.o 00:04:26.970 CC test/app/histogram_perf/histogram_perf.o 00:04:26.970 CC test/nvme/simple_copy/simple_copy.o 00:04:26.970 CC test/nvme/reserve/reserve.o 00:04:26.970 CC examples/nvme/hotplug/hotplug.o 00:04:26.970 CC examples/nvme/abort/abort.o 00:04:26.970 CC test/nvme/reset/reset.o 00:04:26.970 CC examples/util/zipf/zipf.o 00:04:26.970 CC test/nvme/fused_ordering/fused_ordering.o 00:04:26.970 CC test/nvme/doorbell_aers/doorbell_aers.o 00:04:26.970 CC test/nvme/e2edp/nvme_dp.o 00:04:26.970 CC test/nvme/compliance/nvme_compliance.o 00:04:26.970 CC test/nvme/startup/startup.o 00:04:26.970 CC test/nvme/fdp/fdp.o 00:04:26.970 CC test/event/reactor/reactor.o 00:04:26.970 CC test/app/jsoncat/jsoncat.o 00:04:26.970 CC examples/ioat/perf/perf.o 00:04:26.970 CC examples/vmd/lsvmd/lsvmd.o 00:04:26.970 CC test/env/memory/memory_ut.o 00:04:26.970 CC test/nvme/boot_partition/boot_partition.o 00:04:26.970 CC test/app/stub/stub.o 00:04:26.970 CC test/nvme/cuse/cuse.o 00:04:26.970 CC test/env/pci/pci_ut.o 00:04:26.970 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:04:26.970 CC examples/vmd/led/led.o 00:04:26.970 CC app/fio/bdev/fio_plugin.o 00:04:26.970 CC examples/ioat/verify/verify.o 00:04:26.970 CC examples/idxd/perf/perf.o 00:04:26.970 CC examples/sock/hello_world/hello_sock.o 00:04:26.970 CC test/blobfs/mkfs/mkfs.o 00:04:26.970 LINK spdk_lspci 00:04:26.970 CC examples/nvmf/nvmf/nvmf.o 00:04:26.970 CC test/env/vtophys/vtophys.o 00:04:26.970 CC test/event/app_repeat/app_repeat.o 00:04:26.970 CC examples/thread/thread/thread_ex.o 00:04:26.970 CC test/accel/dif/dif.o 00:04:26.970 CC examples/blob/cli/blobcli.o 00:04:26.970 CC test/dma/test_dma/test_dma.o 00:04:26.970 CC examples/blob/hello_world/hello_blob.o 00:04:26.970 LINK spdk_nvme_discover 00:04:26.970 CC test/app/bdev_svc/bdev_svc.o 00:04:26.970 CC examples/bdev/hello_world/hello_bdev.o 00:04:26.970 CC test/event/scheduler/scheduler.o 00:04:26.970 CC examples/bdev/bdevperf/bdevperf.o 00:04:26.970 LINK rpc_client_test 00:04:26.970 CC test/bdev/bdevio/bdevio.o 00:04:26.970 CC test/lvol/esnap/esnap.o 00:04:26.970 LINK spdk_trace_record 00:04:26.970 CC test/env/mem_callbacks/mem_callbacks.o 00:04:26.970 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:04:26.970 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:04:26.970 LINK nvmf_tgt 00:04:26.970 LINK interrupt_tgt 00:04:27.232 LINK reactor_perf 00:04:27.232 LINK poller_perf 00:04:27.232 LINK event_perf 00:04:27.232 LINK vhost 00:04:27.232 CXX test/cpp_headers/ioat.o 00:04:27.232 LINK jsoncat 00:04:27.232 CXX test/cpp_headers/ioat_spec.o 00:04:27.232 LINK spdk_tgt 00:04:27.232 CXX test/cpp_headers/iscsi_spec.o 00:04:27.232 LINK lsvmd 00:04:27.232 LINK zipf 00:04:27.232 CXX test/cpp_headers/json.o 00:04:27.232 CXX test/cpp_headers/jsonrpc.o 00:04:27.232 LINK histogram_perf 00:04:27.232 CXX test/cpp_headers/likely.o 00:04:27.232 LINK pmr_persistence 
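Every CXX test/cpp_headers/*.o step in this stretch compiles a tiny translation unit that includes exactly one public header, which catches headers that do not compile on their own. A minimal sketch of that idea, assuming the standard include/spdk layout (the loop and file names are illustrative):

  # hypothetical sketch: compile one stub per public header to flag non-self-contained headers
  for h in include/spdk/*.h; do
    printf '#include <spdk/%s>\n' "$(basename "$h")" > hdr_check.cpp
    g++ -Iinclude -c hdr_check.cpp -o /dev/null || echo "not self-contained: $h"
  done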
00:04:27.232 CXX test/cpp_headers/log.o 00:04:27.232 LINK reactor 00:04:27.232 CXX test/cpp_headers/lvol.o 00:04:27.232 LINK led 00:04:27.232 CXX test/cpp_headers/memory.o 00:04:27.232 CXX test/cpp_headers/mmio.o 00:04:27.232 LINK iscsi_tgt 00:04:27.232 CXX test/cpp_headers/nbd.o 00:04:27.232 CXX test/cpp_headers/notify.o 00:04:27.232 LINK connect_stress 00:04:27.232 CXX test/cpp_headers/nvme.o 00:04:27.232 CXX test/cpp_headers/nvme_intel.o 00:04:27.232 CXX test/cpp_headers/nvme_ocssd.o 00:04:27.232 CXX test/cpp_headers/nvme_ocssd_spec.o 00:04:27.232 CXX test/cpp_headers/nvme_spec.o 00:04:27.232 CXX test/cpp_headers/nvme_zns.o 00:04:27.232 CXX test/cpp_headers/nvmf_cmd.o 00:04:27.232 CXX test/cpp_headers/nvmf_fc_spec.o 00:04:27.232 CXX test/cpp_headers/nvmf.o 00:04:27.232 CXX test/cpp_headers/nvmf_spec.o 00:04:27.232 CXX test/cpp_headers/nvmf_transport.o 00:04:27.232 CXX test/cpp_headers/opal.o 00:04:27.232 CXX test/cpp_headers/opal_spec.o 00:04:27.232 CXX test/cpp_headers/pci_ids.o 00:04:27.232 CXX test/cpp_headers/pipe.o 00:04:27.232 CXX test/cpp_headers/queue.o 00:04:27.232 LINK env_dpdk_post_init 00:04:27.232 CXX test/cpp_headers/reduce.o 00:04:27.232 LINK app_repeat 00:04:27.232 LINK startup 00:04:27.232 LINK boot_partition 00:04:27.232 LINK vtophys 00:04:27.232 LINK err_injection 00:04:27.232 CXX test/cpp_headers/rpc.o 00:04:27.232 CXX test/cpp_headers/scheduler.o 00:04:27.232 LINK cmb_copy 00:04:27.232 CXX test/cpp_headers/scsi.o 00:04:27.232 LINK reserve 00:04:27.232 LINK doorbell_aers 00:04:27.232 LINK stub 00:04:27.232 CXX test/cpp_headers/scsi_spec.o 00:04:27.232 LINK fused_ordering 00:04:27.232 LINK mkfs 00:04:27.232 LINK hotplug 00:04:27.232 LINK hello_world 00:04:27.232 CXX test/cpp_headers/sock.o 00:04:27.232 LINK bdev_svc 00:04:27.232 LINK ioat_perf 00:04:27.232 LINK hello_sock 00:04:27.232 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:04:27.232 LINK verify 00:04:27.232 LINK simple_copy 00:04:27.232 LINK aer 00:04:27.232 LINK spdk_trace 00:04:27.232 LINK sgl 00:04:27.232 LINK nvme_dp 00:04:27.232 LINK overhead 00:04:27.232 LINK thread 00:04:27.232 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:04:27.232 LINK scheduler 00:04:27.232 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:04:27.232 LINK hello_bdev 00:04:27.232 LINK hello_blob 00:04:27.232 LINK reset 00:04:27.232 LINK fdp 00:04:27.232 LINK mem_callbacks 00:04:27.232 LINK nvmf 00:04:27.232 CXX test/cpp_headers/stdinc.o 00:04:27.232 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:04:27.232 CXX test/cpp_headers/string.o 00:04:27.232 CXX test/cpp_headers/thread.o 00:04:27.232 CXX test/cpp_headers/trace.o 00:04:27.232 CXX test/cpp_headers/trace_parser.o 00:04:27.497 CXX test/cpp_headers/tree.o 00:04:27.497 CXX test/cpp_headers/ublk.o 00:04:27.497 CXX test/cpp_headers/util.o 00:04:27.497 CXX test/cpp_headers/uuid.o 00:04:27.497 CXX test/cpp_headers/version.o 00:04:27.497 CXX test/cpp_headers/vfio_user_pci.o 00:04:27.497 CXX test/cpp_headers/vfio_user_spec.o 00:04:27.498 CXX test/cpp_headers/vhost.o 00:04:27.498 CXX test/cpp_headers/vmd.o 00:04:27.498 LINK abort 00:04:27.498 CXX test/cpp_headers/xor.o 00:04:27.498 CXX test/cpp_headers/zipf.o 00:04:27.498 LINK idxd_perf 00:04:27.498 LINK spdk_dd 00:04:27.498 LINK reconnect 00:04:27.498 LINK test_dma 00:04:27.498 LINK arbitration 00:04:27.498 LINK accel_perf 00:04:27.498 LINK dif 00:04:27.498 LINK bdevio 00:04:27.498 LINK nvme_manage 00:04:27.498 LINK pci_ut 00:04:27.498 LINK nvme_fuzz 00:04:27.756 LINK nvme_compliance 00:04:27.756 LINK spdk_nvme_identify 00:04:27.756 
LINK blobcli 00:04:27.756 LINK llvm_vfio_fuzz 00:04:27.756 LINK memory_ut 00:04:27.756 LINK spdk_nvme 00:04:27.756 LINK spdk_bdev 00:04:28.015 LINK vhost_fuzz 00:04:28.015 LINK spdk_nvme_perf 00:04:28.015 LINK cuse 00:04:28.015 LINK bdevperf 00:04:28.015 LINK spdk_top 00:04:28.015 LINK llvm_nvme_fuzz 00:04:28.274 LINK spdk_lock 00:04:28.532 LINK iscsi_fuzz 00:04:30.541 LINK esnap 00:04:30.827 00:04:30.827 real 0m23.242s 00:04:30.827 user 4m16.900s 00:04:30.827 sys 1m58.164s 00:04:30.827 08:22:23 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:04:30.827 08:22:23 -- common/autotest_common.sh@10 -- $ set +x 00:04:30.827 ************************************ 00:04:30.827 END TEST make 00:04:30.827 ************************************ 00:04:30.827 08:22:23 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:04:30.827 08:22:23 -- nvmf/common.sh@7 -- # uname -s 00:04:30.827 08:22:23 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:04:30.827 08:22:23 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:04:30.827 08:22:23 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:04:30.827 08:22:23 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:04:30.827 08:22:23 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:04:30.827 08:22:23 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:04:30.827 08:22:23 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:04:30.827 08:22:23 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:04:30.827 08:22:23 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:04:30.827 08:22:23 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:04:30.827 08:22:23 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:04:30.827 08:22:23 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:04:30.827 08:22:23 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:04:30.827 08:22:23 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:04:30.827 08:22:23 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:04:30.827 08:22:23 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:04:30.827 08:22:23 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:04:30.827 08:22:23 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:04:30.827 08:22:23 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:04:30.827 08:22:23 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:30.827 08:22:23 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:30.827 08:22:23 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:30.827 08:22:23 -- paths/export.sh@5 -- # export PATH 00:04:30.827 08:22:23 -- 
paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:30.827 08:22:23 -- nvmf/common.sh@46 -- # : 0 00:04:30.827 08:22:23 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:04:30.827 08:22:23 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:04:30.827 08:22:23 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:04:30.827 08:22:23 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:04:30.827 08:22:23 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:04:30.827 08:22:23 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:04:30.827 08:22:23 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:04:30.827 08:22:23 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:04:30.827 08:22:23 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:04:30.827 08:22:23 -- spdk/autotest.sh@32 -- # uname -s 00:04:30.827 08:22:23 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:04:30.827 08:22:23 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:04:30.827 08:22:23 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:04:30.827 08:22:23 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:04:30.827 08:22:23 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:04:30.827 08:22:23 -- spdk/autotest.sh@44 -- # modprobe nbd 00:04:30.827 08:22:23 -- spdk/autotest.sh@46 -- # type -P udevadm 00:04:30.827 08:22:23 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:04:30.827 08:22:23 -- spdk/autotest.sh@48 -- # udevadm_pid=943418 00:04:30.827 08:22:23 -- spdk/autotest.sh@51 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:04:30.827 08:22:23 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:04:30.827 08:22:23 -- spdk/autotest.sh@54 -- # echo 943420 00:04:30.827 08:22:23 -- spdk/autotest.sh@56 -- # echo 943421 00:04:30.827 08:22:23 -- spdk/autotest.sh@58 -- # [[ ............................... 
!= QEMU ]] 00:04:30.827 08:22:23 -- spdk/autotest.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:04:30.827 08:22:23 -- spdk/autotest.sh@60 -- # echo 943422 00:04:30.827 08:22:23 -- spdk/autotest.sh@55 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:04:30.827 08:22:23 -- spdk/autotest.sh@62 -- # echo 943423 00:04:30.827 08:22:23 -- spdk/autotest.sh@66 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:04:30.827 08:22:23 -- spdk/autotest.sh@68 -- # timing_enter autotest 00:04:30.827 08:22:23 -- common/autotest_common.sh@712 -- # xtrace_disable 00:04:30.827 08:22:23 -- common/autotest_common.sh@10 -- # set +x 00:04:30.827 08:22:23 -- spdk/autotest.sh@59 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:04:30.827 08:22:23 -- spdk/autotest.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:04:30.827 08:22:23 -- spdk/autotest.sh@70 -- # create_test_list 00:04:30.827 08:22:23 -- common/autotest_common.sh@736 -- # xtrace_disable 00:04:30.827 08:22:23 -- common/autotest_common.sh@10 -- # set +x 00:04:31.086 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.bmc.pm.log 00:04:31.086 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pm.log 00:04:31.086 08:22:23 -- spdk/autotest.sh@72 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:04:31.086 08:22:23 -- spdk/autotest.sh@72 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:04:31.086 08:22:23 -- spdk/autotest.sh@72 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:04:31.086 08:22:23 -- spdk/autotest.sh@73 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:04:31.086 08:22:23 -- spdk/autotest.sh@74 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:04:31.086 08:22:23 -- spdk/autotest.sh@76 -- # freebsd_update_contigmem_mod 00:04:31.086 08:22:23 -- common/autotest_common.sh@1440 -- # uname 00:04:31.086 08:22:23 -- common/autotest_common.sh@1440 -- # '[' Linux = FreeBSD ']' 00:04:31.086 08:22:23 -- spdk/autotest.sh@77 -- # freebsd_set_maxsock_buf 00:04:31.086 08:22:23 -- common/autotest_common.sh@1460 -- # uname 00:04:31.086 08:22:23 -- common/autotest_common.sh@1460 -- # [[ Linux = FreeBSD ]] 00:04:31.086 08:22:23 -- spdk/autotest.sh@82 -- # grep CC_TYPE mk/cc.mk 00:04:31.086 08:22:23 -- spdk/autotest.sh@82 -- # CC_TYPE=CC_TYPE=clang 00:04:31.086 08:22:23 -- spdk/autotest.sh@83 -- # hash lcov 00:04:31.086 08:22:23 -- spdk/autotest.sh@83 -- # [[ CC_TYPE=clang == *\c\l\a\n\g* ]] 00:04:31.086 08:22:23 -- spdk/autotest.sh@100 -- # timing_enter pre_cleanup 00:04:31.086 08:22:23 -- common/autotest_common.sh@712 -- # xtrace_disable 00:04:31.086 08:22:23 -- common/autotest_common.sh@10 -- # set +x 00:04:31.086 08:22:23 -- spdk/autotest.sh@102 -- # rm -f 00:04:31.086 08:22:23 -- spdk/autotest.sh@105 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:34.387 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:04:34.387 0000:00:04.6 (8086 2021): 
Already using the ioatdma driver 00:04:34.646 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:04:34.646 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:04:34.646 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:04:34.646 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:04:34.646 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:04:34.646 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:04:34.646 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:04:34.646 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:04:34.646 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:04:34.905 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:04:34.905 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:04:34.905 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:04:34.905 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:04:34.905 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:04:34.905 0000:d8:00.0 (8086 0a54): Already using the nvme driver 00:04:34.905 08:22:27 -- spdk/autotest.sh@107 -- # get_zoned_devs 00:04:34.905 08:22:27 -- common/autotest_common.sh@1654 -- # zoned_devs=() 00:04:34.905 08:22:27 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs 00:04:34.905 08:22:27 -- common/autotest_common.sh@1655 -- # local nvme bdf 00:04:34.905 08:22:27 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme* 00:04:34.905 08:22:27 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1 00:04:34.905 08:22:27 -- common/autotest_common.sh@1647 -- # local device=nvme0n1 00:04:34.905 08:22:27 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:34.905 08:22:27 -- common/autotest_common.sh@1650 -- # [[ none != none ]] 00:04:34.905 08:22:27 -- spdk/autotest.sh@109 -- # (( 0 > 0 )) 00:04:34.905 08:22:27 -- spdk/autotest.sh@121 -- # ls /dev/nvme0n1 00:04:34.905 08:22:27 -- spdk/autotest.sh@121 -- # grep -v p 00:04:34.905 08:22:27 -- spdk/autotest.sh@121 -- # for dev in $(ls /dev/nvme*n* | grep -v p || true) 00:04:34.905 08:22:27 -- spdk/autotest.sh@123 -- # [[ -z '' ]] 00:04:34.905 08:22:27 -- spdk/autotest.sh@124 -- # block_in_use /dev/nvme0n1 00:04:34.905 08:22:27 -- scripts/common.sh@380 -- # local block=/dev/nvme0n1 pt 00:04:34.905 08:22:27 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:04:34.905 No valid GPT data, bailing 00:04:34.905 08:22:27 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:34.905 08:22:27 -- scripts/common.sh@393 -- # pt= 00:04:34.905 08:22:27 -- scripts/common.sh@394 -- # return 1 00:04:34.905 08:22:27 -- spdk/autotest.sh@125 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:04:34.905 1+0 records in 00:04:34.905 1+0 records out 00:04:34.905 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00441072 s, 238 MB/s 00:04:34.905 08:22:27 -- spdk/autotest.sh@129 -- # sync 00:04:35.165 08:22:27 -- spdk/autotest.sh@131 -- # xtrace_disable_per_cmd reap_spdk_processes 00:04:35.165 08:22:27 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:04:35.165 08:22:27 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:04:41.739 08:22:33 -- spdk/autotest.sh@135 -- # uname -s 00:04:41.739 08:22:33 -- spdk/autotest.sh@135 -- # '[' Linux = Linux ']' 00:04:41.739 08:22:33 -- spdk/autotest.sh@136 -- # run_test setup.sh 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:04:41.739 08:22:33 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:41.739 08:22:33 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:41.739 08:22:33 -- common/autotest_common.sh@10 -- # set +x 00:04:41.739 ************************************ 00:04:41.739 START TEST setup.sh 00:04:41.739 ************************************ 00:04:41.739 08:22:33 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:04:41.739 * Looking for test storage... 00:04:41.739 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:41.739 08:22:33 -- setup/test-setup.sh@10 -- # uname -s 00:04:41.739 08:22:33 -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:04:41.739 08:22:33 -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:04:41.739 08:22:33 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:41.739 08:22:33 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:41.739 08:22:33 -- common/autotest_common.sh@10 -- # set +x 00:04:41.739 ************************************ 00:04:41.739 START TEST acl 00:04:41.739 ************************************ 00:04:41.739 08:22:33 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:04:41.739 * Looking for test storage... 00:04:41.739 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:41.739 08:22:33 -- setup/acl.sh@10 -- # get_zoned_devs 00:04:41.739 08:22:33 -- common/autotest_common.sh@1654 -- # zoned_devs=() 00:04:41.739 08:22:33 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs 00:04:41.739 08:22:33 -- common/autotest_common.sh@1655 -- # local nvme bdf 00:04:41.739 08:22:33 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme* 00:04:41.739 08:22:33 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1 00:04:41.739 08:22:33 -- common/autotest_common.sh@1647 -- # local device=nvme0n1 00:04:41.739 08:22:33 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:41.739 08:22:33 -- common/autotest_common.sh@1650 -- # [[ none != none ]] 00:04:41.739 08:22:33 -- setup/acl.sh@12 -- # devs=() 00:04:41.739 08:22:33 -- setup/acl.sh@12 -- # declare -a devs 00:04:41.739 08:22:33 -- setup/acl.sh@13 -- # drivers=() 00:04:41.739 08:22:33 -- setup/acl.sh@13 -- # declare -A drivers 00:04:41.739 08:22:33 -- setup/acl.sh@51 -- # setup reset 00:04:41.739 08:22:33 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:41.739 08:22:33 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:44.280 08:22:36 -- setup/acl.sh@52 -- # collect_setup_devs 00:04:44.280 08:22:36 -- setup/acl.sh@16 -- # local dev driver 00:04:44.280 08:22:36 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:44.280 08:22:36 -- setup/acl.sh@15 -- # setup output status 00:04:44.280 08:22:36 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:44.280 08:22:36 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:04:46.817 Hugepages 00:04:46.817 node hugesize free / total 00:04:46.817 08:22:39 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:46.817 08:22:39 -- setup/acl.sh@19 -- # continue 00:04:46.817 08:22:39 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 
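The read -r _ dev _ _ _ driver _ trace above is acl.sh consuming the setup.sh status table row by row, discarding the columns it does not need. The same idiom in isolation, assuming a status-style table is read from a file (variable names are illustrative):

  # hypothetical sketch of the row-parsing idiom traced above
  while read -r _ bdf _ _ _ driver _; do
    [[ $bdf == *:*:*.* ]] || continue    # skip hugepage and header rows
    [[ $driver == nvme ]] && echo "NVMe controller at $bdf"
  done < status.txt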
00:04:46.817 08:22:39 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:46.817 08:22:39 -- setup/acl.sh@19 -- # continue 00:04:46.817 08:22:39 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.817 08:22:39 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:46.817 08:22:39 -- setup/acl.sh@19 -- # continue 00:04:46.817 08:22:39 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.817 00:04:46.817 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:46.817 08:22:39 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:46.817 08:22:39 -- setup/acl.sh@19 -- # continue 00:04:46.817 08:22:39 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.817 08:22:39 -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:04:46.817 08:22:39 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:46.817 08:22:39 -- setup/acl.sh@20 -- # continue 00:04:46.817 08:22:39 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.817 08:22:39 -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:04:46.817 08:22:39 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:46.817 08:22:39 -- setup/acl.sh@20 -- # continue 00:04:46.817 08:22:39 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.817 08:22:39 -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:04:46.817 08:22:39 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:46.817 08:22:39 -- setup/acl.sh@20 -- # continue 00:04:46.817 08:22:39 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.817 08:22:39 -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:04:46.817 08:22:39 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:46.817 08:22:39 -- setup/acl.sh@20 -- # continue 00:04:46.817 08:22:39 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.817 08:22:39 -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:04:46.817 08:22:39 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:46.817 08:22:39 -- setup/acl.sh@20 -- # continue 00:04:46.817 08:22:39 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.817 08:22:39 -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:04:46.817 08:22:39 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:46.817 08:22:39 -- setup/acl.sh@20 -- # continue 00:04:46.817 08:22:39 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.817 08:22:39 -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:04:46.817 08:22:39 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:46.817 08:22:39 -- setup/acl.sh@20 -- # continue 00:04:46.817 08:22:39 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.818 08:22:39 -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:04:46.818 08:22:39 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:46.818 08:22:39 -- setup/acl.sh@20 -- # continue 00:04:46.818 08:22:39 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.818 08:22:39 -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:04:46.818 08:22:39 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:46.818 08:22:39 -- setup/acl.sh@20 -- # continue 00:04:46.818 08:22:39 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.818 08:22:39 -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:04:46.818 08:22:39 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:46.818 08:22:39 -- setup/acl.sh@20 -- # continue 00:04:46.818 08:22:39 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.818 08:22:39 -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:04:46.818 08:22:39 -- setup/acl.sh@20 -- # [[ 
ioatdma == nvme ]] 00:04:46.818 08:22:39 -- setup/acl.sh@20 -- # continue 00:04:46.818 08:22:39 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.818 08:22:39 -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:04:46.818 08:22:39 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:46.818 08:22:39 -- setup/acl.sh@20 -- # continue 00:04:46.818 08:22:39 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.818 08:22:39 -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:04:46.818 08:22:39 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:46.818 08:22:39 -- setup/acl.sh@20 -- # continue 00:04:46.818 08:22:39 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.818 08:22:39 -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:04:46.818 08:22:39 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:46.818 08:22:39 -- setup/acl.sh@20 -- # continue 00:04:46.818 08:22:39 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.818 08:22:39 -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:04:46.818 08:22:39 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:46.818 08:22:39 -- setup/acl.sh@20 -- # continue 00:04:46.818 08:22:39 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:47.077 08:22:39 -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:04:47.077 08:22:39 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:47.077 08:22:39 -- setup/acl.sh@20 -- # continue 00:04:47.077 08:22:39 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:47.077 08:22:39 -- setup/acl.sh@19 -- # [[ 0000:d8:00.0 == *:*:*.* ]] 00:04:47.077 08:22:39 -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:04:47.077 08:22:39 -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:04:47.077 08:22:39 -- setup/acl.sh@22 -- # devs+=("$dev") 00:04:47.077 08:22:39 -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:04:47.077 08:22:39 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:47.077 08:22:39 -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:04:47.077 08:22:39 -- setup/acl.sh@54 -- # run_test denied denied 00:04:47.077 08:22:39 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:47.077 08:22:39 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:47.077 08:22:39 -- common/autotest_common.sh@10 -- # set +x 00:04:47.077 ************************************ 00:04:47.077 START TEST denied 00:04:47.077 ************************************ 00:04:47.077 08:22:39 -- common/autotest_common.sh@1104 -- # denied 00:04:47.077 08:22:39 -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:d8:00.0' 00:04:47.077 08:22:39 -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:d8:00.0' 00:04:47.077 08:22:39 -- setup/acl.sh@38 -- # setup output config 00:04:47.077 08:22:39 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:47.077 08:22:39 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:51.276 0000:d8:00.0 (8086 0a54): Skipping denied controller at 0000:d8:00.0 00:04:51.276 08:22:43 -- setup/acl.sh@40 -- # verify 0000:d8:00.0 00:04:51.276 08:22:43 -- setup/acl.sh@28 -- # local dev driver 00:04:51.276 08:22:43 -- setup/acl.sh@30 -- # for dev in "$@" 00:04:51.276 08:22:43 -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:d8:00.0 ]] 00:04:51.276 08:22:43 -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver 00:04:51.276 08:22:43 -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:04:51.276 08:22:43 -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:04:51.276 
08:22:43 -- setup/acl.sh@41 -- # setup reset 00:04:51.276 08:22:43 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:51.276 08:22:43 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:55.473 00:04:55.473 real 0m8.238s 00:04:55.473 user 0m2.768s 00:04:55.473 sys 0m4.839s 00:04:55.473 08:22:47 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:55.473 08:22:47 -- common/autotest_common.sh@10 -- # set +x 00:04:55.473 ************************************ 00:04:55.473 END TEST denied 00:04:55.473 ************************************ 00:04:55.473 08:22:47 -- setup/acl.sh@55 -- # run_test allowed allowed 00:04:55.473 08:22:47 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:55.473 08:22:47 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:55.473 08:22:47 -- common/autotest_common.sh@10 -- # set +x 00:04:55.473 ************************************ 00:04:55.473 START TEST allowed 00:04:55.473 ************************************ 00:04:55.473 08:22:47 -- common/autotest_common.sh@1104 -- # allowed 00:04:55.473 08:22:47 -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:d8:00.0 00:04:55.473 08:22:47 -- setup/acl.sh@45 -- # setup output config 00:04:55.473 08:22:47 -- setup/acl.sh@46 -- # grep -E '0000:d8:00.0 .*: nvme -> .*' 00:04:55.473 08:22:47 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:55.473 08:22:47 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:00.748 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:00.748 08:22:52 -- setup/acl.sh@47 -- # verify 00:05:00.748 08:22:52 -- setup/acl.sh@28 -- # local dev driver 00:05:00.748 08:22:52 -- setup/acl.sh@48 -- # setup reset 00:05:00.748 08:22:52 -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:00.748 08:22:52 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:04.941 00:05:04.941 real 0m8.851s 00:05:04.941 user 0m2.496s 00:05:04.941 sys 0m4.905s 00:05:04.941 08:22:56 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:04.941 08:22:56 -- common/autotest_common.sh@10 -- # set +x 00:05:04.941 ************************************ 00:05:04.941 END TEST allowed 00:05:04.941 ************************************ 00:05:04.941 00:05:04.941 real 0m23.174s 00:05:04.941 user 0m7.181s 00:05:04.941 sys 0m13.797s 00:05:04.941 08:22:56 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:04.941 08:22:56 -- common/autotest_common.sh@10 -- # set +x 00:05:04.941 ************************************ 00:05:04.941 END TEST acl 00:05:04.941 ************************************ 00:05:04.941 08:22:56 -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:05:04.941 08:22:56 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:04.941 08:22:56 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:04.941 08:22:56 -- common/autotest_common.sh@10 -- # set +x 00:05:04.941 ************************************ 00:05:04.941 START TEST hugepages 00:05:04.941 ************************************ 00:05:04.941 08:22:56 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:05:04.941 * Looking for test storage... 
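The hugepages test starting here takes its baseline from /proc/meminfo with the IFS=': ' read loop traced below. A standalone sketch of that pattern, assuming a Linux /proc (the function name matches the trace, the body is simplified):

  # minimal sketch of the get_meminfo pattern traced below
  get_meminfo() {
    local key=$1 var val _
    while IFS=': ' read -r var val _; do
      [[ $var == "$key" ]] && { echo "$val"; return 0; }
    done < /proc/meminfo
    return 1
  }
  get_meminfo Hugepagesize    # prints the default hugepage size in kB, e.g. 2048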
00:05:04.941 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:05:04.941 08:22:56 -- setup/hugepages.sh@10 -- # nodes_sys=() 00:05:04.941 08:22:56 -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:05:04.941 08:22:56 -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:05:04.941 08:22:56 -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:05:04.941 08:22:56 -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:05:04.941 08:22:56 -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:05:04.941 08:22:56 -- setup/common.sh@17 -- # local get=Hugepagesize 00:05:04.941 08:22:56 -- setup/common.sh@18 -- # local node= 00:05:04.941 08:22:56 -- setup/common.sh@19 -- # local var val 00:05:04.941 08:22:56 -- setup/common.sh@20 -- # local mem_f mem 00:05:04.941 08:22:56 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:04.941 08:22:56 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:04.941 08:22:56 -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:04.941 08:22:56 -- setup/common.sh@28 -- # mapfile -t mem 00:05:04.941 08:22:56 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:04.941 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.941 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.941 08:22:56 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283780 kB' 'MemFree: 38071272 kB' 'MemAvailable: 41938812 kB' 'Buffers: 2708 kB' 'Cached: 13598656 kB' 'SwapCached: 0 kB' 'Active: 10329952 kB' 'Inactive: 3768020 kB' 'Active(anon): 9871148 kB' 'Inactive(anon): 0 kB' 'Active(file): 458804 kB' 'Inactive(file): 3768020 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 499996 kB' 'Mapped: 211004 kB' 'Shmem: 9374540 kB' 'KReclaimable: 275164 kB' 'Slab: 1242680 kB' 'SReclaimable: 275164 kB' 'SUnreclaim: 967516 kB' 'KernelStack: 21840 kB' 'PageTables: 8684 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36433340 kB' 'Committed_AS: 11093236 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216740 kB' 'VmallocChunk: 0 kB' 'Percpu: 85568 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 3226996 kB' 'DirectMap2M: 12187648 kB' 'DirectMap1G: 54525952 kB' 00:05:04.941 08:22:56 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.941 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.941 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.941 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.941 08:22:56 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.941 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.941 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.941 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.941 08:22:56 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.941 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.941 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.941 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.941 08:22:56 -- setup/common.sh@32 -- # [[ Buffers == 
\H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.941 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.941 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.941 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.941 08:22:56 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.941 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.941 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.941 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.941 08:22:56 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.941 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.941 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 
-- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 
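Once the scan below reaches the Hugepagesize key, the harness knows the default page size and can size its reservation. A hedged sketch of reserving hugepages by hand (the count is an example value, not what autotest uses):

  # hypothetical sketch: reserve 2 MiB hugepages and confirm the kernel granted them
  echo 1024 | sudo tee /proc/sys/vm/nr_hugepages > /dev/null
  grep -E 'HugePages_(Total|Free)' /proc/meminfo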
00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 
00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.942 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.942 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.943 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.943 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.943 08:22:56 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.943 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.943 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.943 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.943 08:22:56 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.943 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.943 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.943 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.943 08:22:56 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.943 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.943 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.943 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.943 08:22:56 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.943 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.943 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.943 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.943 08:22:56 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.943 08:22:56 -- setup/common.sh@32 -- # continue 00:05:04.943 08:22:56 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.943 08:22:56 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.943 08:22:56 -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.943 08:22:56 -- setup/common.sh@33 -- # echo 2048 00:05:04.943 08:22:56 -- setup/common.sh@33 -- # return 0 00:05:04.943 08:22:56 -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:05:04.943 08:22:56 -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:05:04.943 08:22:56 -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:05:04.943 08:22:56 -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:05:04.943 08:22:56 -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:05:04.943 08:22:56 -- 
setup/hugepages.sh@23 -- # unset -v HUGENODE 00:05:04.943 08:22:56 -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:05:04.943 08:22:56 -- setup/hugepages.sh@207 -- # get_nodes 00:05:04.943 08:22:56 -- setup/hugepages.sh@27 -- # local node 00:05:04.943 08:22:56 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:04.943 08:22:56 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=2048 00:05:04.943 08:22:56 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:04.943 08:22:56 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:05:04.943 08:22:56 -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:04.943 08:22:56 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:04.943 08:22:56 -- setup/hugepages.sh@208 -- # clear_hp 00:05:04.943 08:22:56 -- setup/hugepages.sh@37 -- # local node hp 00:05:04.943 08:22:56 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:05:04.943 08:22:56 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:04.943 08:22:56 -- setup/hugepages.sh@41 -- # echo 0 00:05:04.943 08:22:56 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:04.943 08:22:56 -- setup/hugepages.sh@41 -- # echo 0 00:05:04.943 08:22:56 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:05:04.943 08:22:56 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:04.943 08:22:56 -- setup/hugepages.sh@41 -- # echo 0 00:05:04.943 08:22:56 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:04.943 08:22:56 -- setup/hugepages.sh@41 -- # echo 0 00:05:04.943 08:22:56 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:05:04.943 08:22:56 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:05:04.943 08:22:56 -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:05:04.943 08:22:56 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:04.943 08:22:56 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:04.943 08:22:56 -- common/autotest_common.sh@10 -- # set +x 00:05:04.943 ************************************ 00:05:04.943 START TEST default_setup 00:05:04.943 ************************************ 00:05:04.943 08:22:56 -- common/autotest_common.sh@1104 -- # default_setup 00:05:04.943 08:22:56 -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:05:04.943 08:22:56 -- setup/hugepages.sh@49 -- # local size=2097152 00:05:04.943 08:22:56 -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:05:04.943 08:22:56 -- setup/hugepages.sh@51 -- # shift 00:05:04.943 08:22:56 -- setup/hugepages.sh@52 -- # node_ids=('0') 00:05:04.943 08:22:56 -- setup/hugepages.sh@52 -- # local node_ids 00:05:04.943 08:22:56 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:05:04.943 08:22:56 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:05:04.943 08:22:56 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:05:04.943 08:22:56 -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:05:04.943 08:22:56 -- setup/hugepages.sh@62 -- # local user_nodes 00:05:04.943 08:22:56 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:05:04.943 08:22:56 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:04.943 08:22:56 -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:04.943 08:22:56 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:04.943 08:22:56 -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 
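Everything get_test_nr_hugepages does above reduces to integer arithmetic: 2097152 kB requested over 2048 kB pages gives nr_hugepages=1024, pinned to node 0 (the per-node assignment continues in the trace just below). A minimal standalone sketch of that computation, with illustrative names wherever they differ from the real setup/hugepages.sh:

#!/usr/bin/env bash
# Sketch only: mirrors "get_test_nr_hugepages 2097152 0" from the trace above.
default_hugepages=2048                    # kB per page, the Hugepagesize read earlier
size_kb=${1:-2097152}                     # requested allocation in kB
shift 2>/dev/null || true
user_nodes=("$@")
(( ${#user_nodes[@]} )) || user_nodes=(0) # default to node 0, as in the log
(( size_kb >= default_hugepages )) || { echo "size below one page" >&2; exit 1; }
nr_hugepages=$(( size_kb / default_hugepages ))   # 2097152 / 2048 = 1024
declare -A nodes_test=()
for node in "${user_nodes[@]}"; do        # hypothetical per-node assignment
  nodes_test[$node]=$nr_hugepages
done
echo "nr_hugepages=$nr_hugepages on node(s) ${!nodes_test[*]}"

Invoked as "bash sketch.sh 2097152 0" this prints nr_hugepages=1024 on node(s) 0, the same figure the harness settles on.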
00:05:04.943 08:22:56 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:05:04.943 08:22:56 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:05:04.943 08:22:56 -- setup/hugepages.sh@73 -- # return 0 00:05:04.943 08:22:56 -- setup/hugepages.sh@137 -- # setup output 00:05:04.943 08:22:56 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:04.943 08:22:56 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:08.229 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:08.229 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:08.229 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:08.229 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:08.229 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:08.230 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:08.230 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:08.230 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:08.230 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:08.230 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:08.230 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:08.230 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:08.230 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:08.230 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:08.230 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:08.230 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:09.610 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:09.610 08:23:02 -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:05:09.610 08:23:02 -- setup/hugepages.sh@89 -- # local node 00:05:09.610 08:23:02 -- setup/hugepages.sh@90 -- # local sorted_t 00:05:09.610 08:23:02 -- setup/hugepages.sh@91 -- # local sorted_s 00:05:09.610 08:23:02 -- setup/hugepages.sh@92 -- # local surp 00:05:09.610 08:23:02 -- setup/hugepages.sh@93 -- # local resv 00:05:09.611 08:23:02 -- setup/hugepages.sh@94 -- # local anon 00:05:09.611 08:23:02 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:09.611 08:23:02 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:09.611 08:23:02 -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:09.611 08:23:02 -- setup/common.sh@18 -- # local node= 00:05:09.611 08:23:02 -- setup/common.sh@19 -- # local var val 00:05:09.611 08:23:02 -- setup/common.sh@20 -- # local mem_f mem 00:05:09.611 08:23:02 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:09.611 08:23:02 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:09.611 08:23:02 -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:09.611 08:23:02 -- setup/common.sh@28 -- # mapfile -t mem 00:05:09.611 08:23:02 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:09.611 08:23:02 -- setup/common.sh@31 -- # IFS=': ' 00:05:09.611 08:23:02 -- setup/common.sh@31 -- # read -r var val _ 00:05:09.611 08:23:02 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283780 kB' 'MemFree: 40303200 kB' 'MemAvailable: 44170704 kB' 'Buffers: 2708 kB' 'Cached: 13598780 kB' 'SwapCached: 0 kB' 'Active: 10339756 kB' 'Inactive: 3768020 kB' 'Active(anon): 9880952 kB' 'Inactive(anon): 0 kB' 'Active(file): 458804 kB' 'Inactive(file): 3768020 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 509672 kB' 'Mapped: 210192 kB' 'Shmem: 9374664 kB' 'KReclaimable: 275092 kB' 'Slab: 1240776 kB' 'SReclaimable: 275092 kB' 'SUnreclaim: 965684 kB' 'KernelStack: 21936 
kB' 'PageTables: 8472 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 11103444 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217024 kB' 'VmallocChunk: 0 kB' 'Percpu: 85568 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3226996 kB' 'DirectMap2M: 12187648 kB' 'DirectMap1G: 54525952 kB'
[log condensed: setup/common.sh@31-32 scan the snapshot keys (MemTotal through HardwareCorrupted); none matches AnonHugePages, so each iteration hits "continue"]
00:05:09.612 08:23:02 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:09.612 08:23:02 -- setup/common.sh@33 -- # echo 0 00:05:09.612 08:23:02 -- setup/common.sh@33 -- # return 0 00:05:09.612 08:23:02 -- setup/hugepages.sh@97 -- # anon=0 00:05:09.612 08:23:02 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:09.612 08:23:02 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:09.612 08:23:02 -- setup/common.sh@18 -- # local node= 00:05:09.612 08:23:02 -- setup/common.sh@19 -- # local var val 00:05:09.612 08:23:02 -- setup/common.sh@20 -- # local mem_f mem 00:05:09.612 08:23:02 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:09.612 08:23:02 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:09.612 08:23:02 -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:09.612 08:23:02 -- setup/common.sh@28 -- # mapfile -t mem 00:05:09.612 08:23:02 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:09.612 08:23:02 -- setup/common.sh@31 -- # IFS=': ' 00:05:09.612 08:23:02 -- setup/common.sh@31 -- # read -r var val _ 00:05:09.612 08:23:02 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283780 kB' 'MemFree: 40304620 kB' 'MemAvailable: 44172124 kB' 'Buffers: 2708 kB' 'Cached: 13598784 kB' 'SwapCached: 0 kB' 'Active: 10340028 kB' 'Inactive: 3768020 kB' 'Active(anon): 9881224 kB' 'Inactive(anon): 0 kB' 'Active(file): 458804 kB' 'Inactive(file): 3768020 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 510028 kB' 'Mapped: 210192 kB' 'Shmem: 9374668 kB' 'KReclaimable: 275092 kB' 'Slab: 1240732 kB' 'SReclaimable: 275092 kB' 'SUnreclaim: 965640 kB' 'KernelStack: 21872 kB' 'PageTables: 8164 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 11103452 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217088 kB' 'VmallocChunk: 0 kB' 'Percpu: 85568 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3226996 kB' 'DirectMap2M: 12187648 kB' 'DirectMap1G: 54525952 kB'
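The span above is one complete get_meminfo call: the script slurps /proc/meminfo via mapfile, then an IFS=': ' read loop discards every key until it reaches the requested one and echoes its value. A self-contained sketch of the same pattern, reading the file directly rather than through the script's mem array, so the details are illustrative:

#!/usr/bin/env bash
# Scan /proc/meminfo for one key, echoing its value; every non-matching
# key is a "continue", which is what fills the trace above.
get_meminfo() {
  local get=$1 var val _
  while IFS=': ' read -r var val _; do
    [[ $var == "$get" ]] || continue
    echo "$val"                 # value only, e.g. "0" for AnonHugePages
    return 0
  done < /proc/meminfo
  return 1                      # key absent
}
get_meminfo AnonHugePages       # -> 0 on this runner
get_meminfo HugePages_Total     # -> 1024 once default_setup has run

Each "continue" in the trace is one non-matching key, which is why a single lookup spans dozens of log entries.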
[log condensed: setup/common.sh@31-32 scan the snapshot keys (MemTotal through HugePages_Rsvd); none matches HugePages_Surp, so each iteration hits "continue"]
00:05:09.876 08:23:02 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:09.876 08:23:02 -- setup/common.sh@33 -- # echo 0 00:05:09.876 08:23:02 -- setup/common.sh@33 -- # return 0 00:05:09.876 08:23:02 -- setup/hugepages.sh@99 -- # surp=0 00:05:09.876 08:23:02 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:09.876 08:23:02 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:09.876 08:23:02 -- setup/common.sh@18 -- # local node= 00:05:09.876 08:23:02 -- setup/common.sh@19 -- # local var val 00:05:09.876 08:23:02 -- setup/common.sh@20 -- # local mem_f mem 00:05:09.876 08:23:02 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:09.876 08:23:02 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:09.876 08:23:02 -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:09.876 08:23:02 -- setup/common.sh@28 -- # mapfile -t mem 00:05:09.876 08:23:02 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:09.876 08:23:02 -- setup/common.sh@31 -- # IFS=': ' 00:05:09.876 08:23:02 -- setup/common.sh@31 -- # read -r var val _ 00:05:09.876 08:23:02 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283780 kB' 'MemFree: 40307584 kB' 'MemAvailable: 44175088 kB' 'Buffers: 2708 kB' 'Cached: 13598796 kB' 'SwapCached: 0 kB' 'Active: 10340624 kB' 'Inactive: 3768020 kB' 'Active(anon): 9881820 kB' 'Inactive(anon): 0 kB' 'Active(file): 458804 kB' 'Inactive(file): 3768020 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 510564 kB' 'Mapped: 210244 kB' 'Shmem: 9374680 kB' 'KReclaimable: 275092 kB' 'Slab: 1240796 kB' 'SReclaimable: 275092 kB' 'SUnreclaim: 965704 kB' 'KernelStack: 22272 kB' 'PageTables: 9192 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 11102076 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217120 kB' 'VmallocChunk: 0 kB' 'Percpu: 85568 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3226996 kB' 'DirectMap2M: 12187648 kB' 'DirectMap1G: 54525952 kB'
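The trace repeats the scan once more for HugePages_Rsvd and then checks its bookkeeping (the "(( 1024 == nr_hugepages + surp + resv ))" line below). A hedged sketch of that final verification, reading the counters with awk instead of the script's own get_meminfo and assuming the expected total of 1024 pages:

#!/usr/bin/env bash
# Verify that the pages configured earlier are all accounted for: the total
# should match the target, with no surplus or reserved pages left over.
expected=1024
total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)
surp=$(awk '/^HugePages_Surp:/ {print $2}' /proc/meminfo)
resv=$(awk '/^HugePages_Rsvd:/ {print $2}' /proc/meminfo)
(( total == expected )) || { echo "FAIL: HugePages_Total=$total" >&2; exit 1; }
(( surp == 0 && resv == 0 )) || { echo "FAIL: surp=$surp resv=$resv" >&2; exit 1; }
echo "nr_hugepages=$total resv_hugepages=$resv surplus_hugepages=$surp"

On this runner it would print the same summary the harness echoes below: nr_hugepages=1024, resv_hugepages=0, surplus_hugepages=0.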
[log condensed: setup/common.sh@31-32 scan the snapshot keys (MemTotal through HugePages_Free); none matches HugePages_Rsvd, so each iteration hits "continue"]
00:05:09.877 08:23:02 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:09.877 08:23:02 -- setup/common.sh@33 -- # echo 0 00:05:09.877 08:23:02 -- setup/common.sh@33 -- # return 0 00:05:09.877 08:23:02 -- setup/hugepages.sh@100 -- # resv=0 00:05:09.877 08:23:02 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:05:09.877 nr_hugepages=1024 00:05:09.877 08:23:02 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:09.877 resv_hugepages=0 00:05:09.877 08:23:02 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:09.877 surplus_hugepages=0 00:05:09.877 08:23:02 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:09.877 anon_hugepages=0 00:05:09.877 08:23:02 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:09.877 08:23:02 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:05:09.877 08:23:02 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:09.877 08:23:02 -- setup/common.sh@17 -- # local
00:05:09.877 08:23:02 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:05:09.877 08:23:02 -- setup/common.sh@17 -- # local get=HugePages_Total
00:05:09.877 08:23:02 -- setup/common.sh@18 -- # local node=
00:05:09.877 08:23:02 -- setup/common.sh@19 -- # local var val
00:05:09.877 08:23:02 -- setup/common.sh@20 -- # local mem_f mem
00:05:09.877 08:23:02 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:09.877 08:23:02 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:09.877 08:23:02 -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:09.877 08:23:02 -- setup/common.sh@28 -- # mapfile -t mem
00:05:09.877 08:23:02 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:09.877 08:23:02 -- setup/common.sh@31 -- # IFS=': '
00:05:09.877 08:23:02 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283780 kB' 'MemFree: 40307712 kB' 'MemAvailable: 44175216 kB' 'Buffers: 2708 kB' 'Cached: 13598812 kB' 'SwapCached: 0 kB' 'Active: 10340352 kB' 'Inactive: 3768020 kB' 'Active(anon): 9881548 kB' 'Inactive(anon): 0 kB' 'Active(file): 458804 kB' 'Inactive(file): 3768020 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 510116 kB' 'Mapped: 210244 kB' 'Shmem: 9374696 kB' 'KReclaimable: 275092 kB' 'Slab: 1240788 kB' 'SReclaimable: 275092 kB' 'SUnreclaim: 965696 kB' 'KernelStack: 22144 kB' 'PageTables: 8908 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 11103484 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217104 kB' 'VmallocChunk: 0 kB' 'Percpu: 85568 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3226996 kB' 'DirectMap2M: 12187648 kB' 'DirectMap1G: 54525952 kB'
00:05:09.877 08:23:02 -- setup/common.sh@31 -- # read -r var val _
(each field, MemTotal through Unaccepted, is read with IFS=': ', compared against HugePages_Total, and skipped with continue)
00:05:09.879 08:23:02 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:05:09.879 08:23:02 -- setup/common.sh@33 -- # echo 1024
00:05:09.879 08:23:02 -- setup/common.sh@33 -- # return 0
00:05:09.879 08:23:02 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:09.879 08:23:02 -- setup/hugepages.sh@112 -- # get_nodes
00:05:09.879 08:23:02 -- setup/hugepages.sh@27 -- # local node
00:05:09.879 08:23:02 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:09.879 08:23:02 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:05:09.879 08:23:02 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:09.879 08:23:02 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:05:09.879 08:23:02 -- setup/hugepages.sh@32 -- # no_nodes=2
00:05:09.879 08:23:02 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:05:09.879 08:23:02 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:05:09.879 08:23:02 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:05:09.879 08:23:02 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:05:09.879 08:23:02 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:09.879 08:23:02 -- setup/common.sh@18 -- # local node=0
00:05:09.879 08:23:02 -- setup/common.sh@19 -- # local var val
00:05:09.879 08:23:02 -- setup/common.sh@20 -- # local mem_f mem
00:05:09.879 08:23:02 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:09.879 08:23:02 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:05:09.879 08:23:02 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:05:09.879 08:23:02 -- setup/common.sh@28 -- # mapfile -t mem
00:05:09.879 08:23:02 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:09.879 08:23:02 -- setup/common.sh@31 -- # IFS=': '
00:05:09.879 08:23:02 -- setup/common.sh@31 -- # read -r var val _
00:05:09.879 08:23:02 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 23772924 kB' 'MemUsed: 8812444 kB' 'SwapCached: 0 kB' 'Active: 4699560 kB' 'Inactive: 253652 kB' 'Active(anon): 4451400 kB' 'Inactive(anon): 0 kB' 'Active(file): 248160 kB' 'Inactive(file): 253652 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4635400 kB' 'Mapped: 53144 kB' 'AnonPages: 320912 kB' 'Shmem: 4133588 kB' 'KernelStack: 12200 kB' 'PageTables: 5800 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 122792 kB' 'Slab: 586136 kB' 'SReclaimable: 122792 kB' 'SUnreclaim: 463344 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
(each node0 field, MemTotal through HugePages_Free, is read with IFS=': ', compared against HugePages_Surp, and skipped with continue)
00:05:09.880 08:23:02 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:09.880 08:23:02 -- setup/common.sh@33 -- # echo 0
00:05:09.880 08:23:02 -- setup/common.sh@33 -- # return 0
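For the per-node lookup the helper retargets mem_f at /sys/devices/system/node/node0/meminfo, whose lines carry a "Node 0 " prefix; the traced mem=("${mem[@]#Node +([0-9]) }") expansion strips that prefix (an extglob pattern) before the same field scan runs. A small sketch of that path, assuming node0 exists as it does in this run:

    #!/usr/bin/env bash
    # Sketch of the per-node variant: strip the "Node <id> " prefix
    # with extglob, then scan the remaining "key: value" lines.
    shopt -s extglob
    mapfile -t mem < /sys/devices/system/node/node0/meminfo
    mem=("${mem[@]#Node +([0-9]) }")   # "Node 0 HugePages_Surp: 0" -> "HugePages_Surp: 0"
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == HugePages_Surp ]] && echo "$val"   # 0 in this run
    done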
00:05:09.880 08:23:02 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:05:09.880 08:23:02 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:05:09.880 08:23:02 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:05:09.880 08:23:02 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:05:09.880 08:23:02 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:05:09.880 node0=1024 expecting 1024
00:05:09.880 08:23:02 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:05:09.880 real	0m5.436s
00:05:09.880 user	0m1.443s
00:05:09.880 sys	0m2.478s
00:05:09.880 08:23:02 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:05:09.880 08:23:02 -- common/autotest_common.sh@10 -- # set +x
00:05:09.880 ************************************
00:05:09.880 END TEST default_setup
00:05:09.880 ************************************
00:05:09.880 08:23:02 -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc
00:05:09.880 08:23:02 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:05:09.880 08:23:02 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:05:09.880 08:23:02 -- common/autotest_common.sh@10 -- # set +x
00:05:09.880 ************************************
00:05:09.880 START TEST per_node_1G_alloc
00:05:09.880 ************************************
00:05:09.880 08:23:02 -- common/autotest_common.sh@1104 -- # per_node_1G_alloc
00:05:09.880 08:23:02 -- setup/hugepages.sh@143 -- # local IFS=,
00:05:09.880 08:23:02 -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1
00:05:09.880 08:23:02 -- setup/hugepages.sh@49 -- # local size=1048576
00:05:09.880 08:23:02 -- setup/hugepages.sh@50 -- # (( 3 > 1 ))
00:05:09.880 08:23:02 -- setup/hugepages.sh@51 -- # shift
00:05:09.880 08:23:02 -- setup/hugepages.sh@52 -- # node_ids=('0' '1')
00:05:09.880 08:23:02 -- setup/hugepages.sh@52 -- # local node_ids
00:05:09.880 08:23:02 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:05:09.880 08:23:02 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:05:09.880 08:23:02 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1
00:05:09.880 08:23:02 -- setup/hugepages.sh@62 -- # user_nodes=('0' '1')
00:05:09.880 08:23:02 -- setup/hugepages.sh@62 -- # local user_nodes
00:05:09.880 08:23:02 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:05:09.880 08:23:02 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:05:09.880 08:23:02 -- setup/hugepages.sh@67 -- # nodes_test=()
00:05:09.880 08:23:02 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:05:09.880 08:23:02 -- setup/hugepages.sh@69 -- # (( 2 > 0 ))
00:05:09.880 08:23:02 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:05:09.880 08:23:02 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:05:09.880 08:23:02 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:05:09.880 08:23:02 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:05:09.880 08:23:02 -- setup/hugepages.sh@73 -- # return 0
00:05:09.880 08:23:02 -- setup/hugepages.sh@146 -- # NRHUGE=512
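The get_test_nr_hugepages 1048576 0 1 call above converts a 1 GiB (1048576 kB) request into a page count using the 2048 kB Hugepagesize reported in the meminfo dumps, then assigns that count to each node named on the command line, which is how NRHUGE=512 with HUGENODE=0,1 comes about. A sketch of the arithmetic under those values:

    #!/usr/bin/env bash
    # Sketch of the size-to-pages arithmetic traced above: 1048576 kB
    # requested / 2048 kB per huge page = 512 pages per listed node.
    size_kb=1048576
    hugepagesize_kb=2048                            # 'Hugepagesize: 2048 kB' in the dumps
    nr_hugepages=$(( size_kb / hugepagesize_kb ))   # 512
    declare -a nodes_test
    for node in 0 1; do
        nodes_test[node]=$nr_hugepages              # nodes_test[0]=nodes_test[1]=512
    done
    echo "NRHUGE=$nr_hugepages HUGENODE=0,1"        # the values the trace exports

With both nodes getting the full 512 pages, the system-wide expectation the next step verifies is 2 * 512 = 1024 huge pages.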
00:05:09.880 08:23:02 -- setup/hugepages.sh@146 -- # HUGENODE=0,1
00:05:09.880 08:23:02 -- setup/hugepages.sh@146 -- # setup output
00:05:09.880 08:23:02 -- setup/common.sh@9 -- # [[ output == output ]]
00:05:09.880 08:23:02 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:13.167 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:13.167 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:13.167 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:13.167 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:13.167 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:13.167 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:13.167 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:13.167 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:13.167 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:13.167 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:13.167 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:13.167 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:13.167 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:13.167 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:13.429 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:13.429 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:13.429 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:05:13.429 08:23:05 -- setup/hugepages.sh@147 -- # nr_hugepages=1024
00:05:13.429 08:23:05 -- setup/hugepages.sh@147 -- # verify_nr_hugepages
00:05:13.429 08:23:05 -- setup/hugepages.sh@89 -- # local node
00:05:13.429 08:23:05 -- setup/hugepages.sh@90 -- # local sorted_t
00:05:13.429 08:23:05 -- setup/hugepages.sh@91 -- # local sorted_s
00:05:13.429 08:23:05 -- setup/hugepages.sh@92 -- # local surp
00:05:13.429 08:23:05 -- setup/hugepages.sh@93 -- # local resv
00:05:13.429 08:23:05 -- setup/hugepages.sh@94 -- # local anon
00:05:13.429 08:23:05 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:13.429 08:23:05 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:05:13.429 08:23:05 -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:13.429 08:23:05 -- setup/common.sh@18 -- # local node=
00:05:13.429 08:23:05 -- setup/common.sh@19 -- # local var val
00:05:13.429 08:23:05 -- setup/common.sh@20 -- # local mem_f mem
00:05:13.429 08:23:05 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:13.429 08:23:05 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:13.429 08:23:05 -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:13.429 08:23:05 -- setup/common.sh@28 -- # mapfile -t mem
00:05:13.429 08:23:05 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:13.429 08:23:05 -- setup/common.sh@31 -- # IFS=': '
00:05:13.430 08:23:05 -- setup/common.sh@31 -- # read -r var val _
00:05:13.430 08:23:05 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283780 kB' 'MemFree: 40307928 kB' 'MemAvailable: 44175432 kB' 'Buffers: 2708 kB' 'Cached: 13599068 kB' 'SwapCached: 0 kB' 'Active: 10339500 kB' 'Inactive: 3768020 kB' 'Active(anon): 9880696 kB' 'Inactive(anon): 0 kB' 'Active(file): 458804 kB' 'Inactive(file): 3768020 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 508648 kB' 'Mapped: 209340 kB' 'Shmem: 9374952 kB' 'KReclaimable: 275092 kB' 'Slab: 1240852 kB' 'SReclaimable: 275092 kB' 'SUnreclaim: 965760 kB' 'KernelStack: 21856 kB' 'PageTables: 8404 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 11092872 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217024 kB' 'VmallocChunk: 0 kB' 'Percpu: 85568 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3226996 kB' 'DirectMap2M: 12187648 kB' 'DirectMap1G: 54525952 kB'
(each field, MemTotal through HardwareCorrupted, is read with IFS=': ', compared against AnonHugePages, and skipped with continue)
00:05:13.431 08:23:05 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:13.431 08:23:05 -- setup/common.sh@33 -- # echo 0
00:05:13.431 08:23:05 -- setup/common.sh@33 -- # return 0
00:05:13.431 08:23:05 -- setup/hugepages.sh@97 -- # anon=0
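The bracket test above reads the THP policy from /sys/kernel/mm/transparent_hugepage/enabled, where the kernel marks the active policy in brackets ("always [madvise] never" in this run); only when the policy is not [never] does verify_nr_hugepages go on to fetch AnonHugePages. A sketch of that probe; the awk-based fetch is our stand-in for the harness's get_meminfo:

    #!/usr/bin/env bash
    # Sketch of the THP policy probe behind the bracket test: "[never]"
    # means THP is off and AnonHugePages can be ignored entirely.
    thp=$(cat /sys/kernel/mm/transparent_hugepage/enabled)  # e.g. "always [madvise] never"
    if [[ $thp != *"[never]"* ]]; then
        anon_kb=$(awk '/^AnonHugePages:/ {print $2}' /proc/meminfo)  # 0 in this run
        echo "counting AnonHugePages: ${anon_kb} kB"
    fi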
00:05:13.431 08:23:05 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:05:13.431 08:23:05 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:13.431 08:23:05 -- setup/common.sh@18 -- # local node=
00:05:13.431 08:23:05 -- setup/common.sh@19 -- # local var val
00:05:13.431 08:23:05 -- setup/common.sh@20 -- # local mem_f mem
00:05:13.431 08:23:05 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:13.431 08:23:05 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:13.431 08:23:05 -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:13.431 08:23:05 -- setup/common.sh@28 -- # mapfile -t mem
00:05:13.431 08:23:05 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:13.431 08:23:05 -- setup/common.sh@31 -- # IFS=': '
00:05:13.431 08:23:05 -- setup/common.sh@31 -- # read -r var val _
00:05:13.431 08:23:05 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283780 kB' 'MemFree: 40308040 kB' 'MemAvailable: 44175544 kB' 'Buffers: 2708 kB' 'Cached: 13599068 kB' 'SwapCached: 0 kB' 'Active: 10339720 kB' 'Inactive: 3768020 kB' 'Active(anon): 9880916 kB' 'Inactive(anon): 0 kB' 'Active(file): 458804 kB' 'Inactive(file): 3768020 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 508924 kB' 'Mapped: 209292 kB' 'Shmem: 9374952 kB' 'KReclaimable: 275092 kB' 'Slab: 1240852 kB' 'SReclaimable: 275092 kB' 'SUnreclaim: 965760 kB' 'KernelStack: 21872 kB' 'PageTables: 8416 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 11092884 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216992 kB' 'VmallocChunk: 0 kB' 'Percpu: 85568 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3226996 kB' 'DirectMap2M: 12187648 kB' 'DirectMap1G: 54525952 kB'
(each field, MemTotal through HugePages_Rsvd, is read with IFS=': ', compared against HugePages_Surp, and skipped with continue)
00:05:13.432 08:23:06 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:13.432 08:23:06 -- setup/common.sh@33 -- # echo 0
00:05:13.432 08:23:06 -- setup/common.sh@33 -- # return 0
00:05:13.432 08:23:06 -- setup/hugepages.sh@99 -- # surp=0
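With anon and surp collected and resv about to be fetched, verify_nr_hugepages checks the kernel's HugePages_Total against the expected page count plus surplus and reserved pages, the same (( 1024 == nr_hugepages + surp + resv )) test seen earlier in this trace. A sketch of that identity; the awk-based reads are our stand-in for get_meminfo:

    #!/usr/bin/env bash
    # Sketch of the consistency check the trace builds toward: total
    # huge pages should equal the requested count plus surplus and
    # reserved pages (all zero except the 1024 total in this run).
    expected=1024
    total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)
    surp=$(awk '/^HugePages_Surp:/ {print $2}' /proc/meminfo)
    resv=$(awk '/^HugePages_Rsvd:/ {print $2}' /proc/meminfo)
    (( total == expected + surp + resv )) && echo "nr_hugepages=$total OK"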
08:23:06 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:13.431 08:23:06 -- setup/common.sh@32 -- # continue
00:05:13.431 08:23:06 -- setup/common.sh@31 -- # IFS=': '
00:05:13.431 08:23:06 -- setup/common.sh@31 -- # read -r var val _
[... xtrace elided: the same test/continue/IFS/read cycle repeats for every remaining /proc/meminfo key (Mlocked through HugePages_Rsvd) until the requested key comes up ...]
00:05:13.432 08:23:06 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:13.432 08:23:06 -- setup/common.sh@33 -- # echo 0
00:05:13.432 08:23:06 -- setup/common.sh@33 -- # return 0
00:05:13.432 08:23:06 -- setup/hugepages.sh@99 -- # surp=0
00:05:13.432 08:23:06 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:05:13.432 08:23:06 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:13.432 08:23:06 -- setup/common.sh@18 -- # local node=
00:05:13.432 08:23:06 -- setup/common.sh@19 -- # local var val
00:05:13.432 08:23:06 -- setup/common.sh@20 -- # local mem_f mem
00:05:13.432 08:23:06 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:13.432 08:23:06 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:13.432 08:23:06 -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:13.432 08:23:06 -- setup/common.sh@28 -- # mapfile -t mem
00:05:13.432 08:23:06 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:13.432 08:23:06 -- setup/common.sh@31 -- # IFS=': '
00:05:13.432 08:23:06 -- setup/common.sh@31 -- # read -r var val _
00:05:13.432 08:23:06 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283780 kB' 'MemFree: 40307672 kB' 'MemAvailable: 44175176 kB' 'Buffers: 2708 kB' 'Cached: 13599080 kB' 'SwapCached: 0 kB' 'Active: 10339900 kB' 'Inactive: 3768020 kB' 'Active(anon): 9881096 kB' 'Inactive(anon): 0 kB' 'Active(file): 458804 kB' 'Inactive(file): 3768020 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 509524 kB' 'Mapped: 209716 kB' 'Shmem: 9374964 kB' 'KReclaimable: 275092 kB' 'Slab: 1240820 kB' 'SReclaimable: 275092 kB' 'SUnreclaim: 965728 kB' 'KernelStack: 21808 kB' 'PageTables: 8196 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 11094784 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216992 kB' 'VmallocChunk: 0 kB' 'Percpu: 85568 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3226996 kB' 'DirectMap2M: 12187648 kB' 'DirectMap1G: 54525952 kB'
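[editor's note] The block above is one full get_meminfo pass: snapshot the file with mapfile, strip any "Node <n> " prefix with an extglob expansion, then scan key by key under IFS=': ' until the requested field matches. As a standalone illustration, here is a minimal sketch of that pattern; the helper name meminfo_sketch is hypothetical and the details differ from the real setup/common.sh:

    #!/usr/bin/env bash
    # Minimal sketch of a get_meminfo-style reader (hypothetical name;
    # the real helper lives in setup/common.sh and differs in detail).
    shopt -s extglob   # needed for the +([0-9]) pattern below

    meminfo_sketch() {
        local get=$1 node=${2:-}   # key to look up, optional NUMA node
        local mem_f=/proc/meminfo var val _ line
        # Per-node queries read the node's own meminfo file instead.
        [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        local -a mem
        mapfile -t mem < "$mem_f"
        # Node files prefix every line with "Node <n> "; strip it, as the
        # traced mem=("${mem[@]#Node +([0-9]) }") expansion does.
        mem=("${mem[@]#Node +([0-9]) }")
        for line in "${mem[@]}"; do
            # IFS=': ' splits "HugePages_Surp:   0" into key and value.
            IFS=': ' read -r var val _ <<< "$line"
            if [[ $var == "$get" ]]; then
                echo "$val"
                return 0
            fi
        done
        return 1
    }

    meminfo_sketch HugePages_Surp      # -> 0 on the box traced above
    meminfo_sketch HugePages_Total 0   # node 0 -> 512 on this box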
[... xtrace elided: the same per-key scan now runs against \H\u\g\e\P\a\g\e\s\_\R\s\v\d, MemTotal through HugePages_Free, continuing on every non-match ...]
00:05:13.434 08:23:06 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:13.434 08:23:06 -- setup/common.sh@33 -- # echo 0
00:05:13.434 08:23:06 -- setup/common.sh@33 -- # return 0
00:05:13.434 08:23:06 -- setup/hugepages.sh@100 -- # resv=0
00:05:13.434 08:23:06 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:05:13.434 nr_hugepages=1024
00:05:13.434 08:23:06 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:05:13.434 resv_hugepages=0
00:05:13.434 08:23:06 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:05:13.434 surplus_hugepages=0
00:05:13.434 08:23:06 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:05:13.434 anon_hugepages=0
00:05:13.434 08:23:06 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:13.434 08:23:06 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
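[editor's note] The echoes and arithmetic guards above encode the pool invariant the test relies on: the kernel's HugePages_Total must equal the requested page count plus surplus plus reserved pages. A self-contained sketch of the same check, with the 1024 target taken from this run rather than hard-wired:

    #!/usr/bin/env bash
    # Sketch: verify the hugepage pool against a requested size, mirroring
    # the (( 1024 == nr_hugepages + surp + resv )) guards traced above.
    requested=1024   # this run's nr_hugepages; an input, not a constant

    total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)
    surp=$(awk '/^HugePages_Surp:/ {print $2}' /proc/meminfo)
    resv=$(awk '/^HugePages_Rsvd:/ {print $2}' /proc/meminfo)

    if (( total == requested + surp + resv )); then
        echo "pool consistent: total=$total surp=$surp resv=$resv"
    else
        echo "pool drifted: total=$total, expected $((requested + surp + resv))" >&2
    fi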
00:05:13.434 08:23:06 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:05:13.434 08:23:06 -- setup/common.sh@17 -- # local get=HugePages_Total
00:05:13.434 08:23:06 -- setup/common.sh@18 -- # local node=
00:05:13.434 08:23:06 -- setup/common.sh@19 -- # local var val
00:05:13.434 08:23:06 -- setup/common.sh@20 -- # local mem_f mem
00:05:13.434 08:23:06 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:13.434 08:23:06 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:13.434 08:23:06 -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:13.434 08:23:06 -- setup/common.sh@28 -- # mapfile -t mem
00:05:13.434 08:23:06 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:13.434 08:23:06 -- setup/common.sh@31 -- # IFS=': '
00:05:13.434 08:23:06 -- setup/common.sh@31 -- # read -r var val _
00:05:13.434 08:23:06 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283780 kB' 'MemFree: 40299888 kB' 'MemAvailable: 44167392 kB' 'Buffers: 2708 kB' 'Cached: 13599096 kB' 'SwapCached: 0 kB' 'Active: 10344028 kB' 'Inactive: 3768020 kB' 'Active(anon): 9885224 kB' 'Inactive(anon): 0 kB' 'Active(file): 458804 kB' 'Inactive(file): 3768020 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 513596 kB' 'Mapped: 209716 kB' 'Shmem: 9374980 kB' 'KReclaimable: 275092 kB' 'Slab: 1240820 kB' 'SReclaimable: 275092 kB' 'SUnreclaim: 965728 kB' 'KernelStack: 21856 kB' 'PageTables: 8348 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 11098096 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216992 kB' 'VmallocChunk: 0 kB' 'Percpu: 85568 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3226996 kB' 'DirectMap2M: 12187648 kB' 'DirectMap1G: 54525952 kB'
[... xtrace elided: per-key scan against \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l, MemTotal through Unaccepted, continuing on every non-match ...]
00:05:13.435 08:23:06 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:05:13.435 08:23:06 -- setup/common.sh@33 -- # echo 1024
00:05:13.435 08:23:06 -- setup/common.sh@33 -- # return 0
00:05:13.435 08:23:06 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:13.435 08:23:06 -- setup/hugepages.sh@112 -- # get_nodes
00:05:13.435 08:23:06 -- setup/hugepages.sh@27 -- # local node
00:05:13.435 08:23:06 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:13.435 08:23:06 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:05:13.435 08:23:06 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:13.435 08:23:06 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:05:13.435 08:23:06 -- setup/hugepages.sh@32 -- # no_nodes=2
00:05:13.435 08:23:06 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:05:13.435 08:23:06 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:05:13.435 08:23:06 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:05:13.435 08:23:06 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:05:13.435 08:23:06 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:13.435 08:23:06 -- setup/common.sh@18 -- # local node=0
00:05:13.435 08:23:06 -- setup/common.sh@19 -- # local var val
00:05:13.435 08:23:06 -- setup/common.sh@20 -- # local mem_f mem
00:05:13.435 08:23:06 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:13.435 08:23:06 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:05:13.435 08:23:06 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:05:13.435 08:23:06 -- setup/common.sh@28 -- # mapfile -t mem
00:05:13.435 08:23:06 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:13.435 08:23:06 -- setup/common.sh@31 -- # IFS=': '
00:05:13.435 08:23:06 -- setup/common.sh@31 -- # read -r var val _
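[editor's note] get_nodes above enumerates /sys/devices/system/node/node<N> with an extglob pattern and records a per-node expectation, and the per-node get_meminfo calls that follow switch mem_f to the node's own meminfo file. A sketch of that discovery step, reading the kernel's live per-node totals instead of the test's configured 512s:

    #!/usr/bin/env bash
    # Sketch: enumerate NUMA nodes and read each node's current hugepage
    # total, in the spirit of get_nodes plus the per-node get_meminfo calls.
    shopt -s extglob nullglob
    declare -a node_pages=()
    for node_dir in /sys/devices/system/node/node+([0-9]); do
        n=${node_dir##*node}   # "node0" -> "0"
        # Per-node lines read "Node 0 HugePages_Total:  512", so take $NF.
        node_pages[n]=$(awk '/HugePages_Total:/ {print $NF}' "$node_dir/meminfo")
    done
    echo "found ${#node_pages[@]} node(s), per-node totals: ${node_pages[*]}"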
00:05:13.435 08:23:06 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 24816968 kB' 'MemUsed: 7768400 kB' 'SwapCached: 0 kB' 'Active: 4698764 kB' 'Inactive: 253652 kB' 'Active(anon): 4450604 kB' 'Inactive(anon): 0 kB' 'Active(file): 248160 kB' 'Inactive(file): 253652 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4635452 kB' 'Mapped: 52640 kB' 'AnonPages: 320124 kB' 'Shmem: 4133640 kB' 'KernelStack: 11896 kB' 'PageTables: 5220 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 122792 kB' 'Slab: 586268 kB' 'SReclaimable: 122792 kB' 'SUnreclaim: 463476 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[... xtrace elided: per-key scan of the node0 snapshot against \H\u\g\e\P\a\g\e\s\_\S\u\r\p, MemTotal through HugePages_Total, continuing on every non-match (elapsed stamps advance 00:05:13.436 to 00:05:13.769 within the scan) ...]
00:05:13.769 08:23:06 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.769 08:23:06 -- setup/common.sh@32 -- # continue 00:05:13.769 08:23:06 -- setup/common.sh@31 -- # IFS=': ' 00:05:13.769 08:23:06 -- setup/common.sh@31 -- # read -r var val _ 00:05:13.769 08:23:06 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.769 08:23:06 -- setup/common.sh@33 -- # echo 0 00:05:13.769 08:23:06 -- setup/common.sh@33 -- # return 0 00:05:13.769 08:23:06 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:13.769 08:23:06 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:13.769 08:23:06 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:13.769 08:23:06 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:05:13.769 08:23:06 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:13.769 08:23:06 -- setup/common.sh@18 -- # local node=1 00:05:13.769 08:23:06 -- setup/common.sh@19 -- # local var val 00:05:13.769 08:23:06 -- setup/common.sh@20 -- # local mem_f mem 00:05:13.769 08:23:06 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:13.769 08:23:06 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:13.769 08:23:06 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:13.769 08:23:06 -- setup/common.sh@28 -- # mapfile -t mem 00:05:13.769 08:23:06 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:13.769 08:23:06 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698412 kB' 'MemFree: 15483896 kB' 'MemUsed: 12214516 kB' 'SwapCached: 0 kB' 'Active: 5645412 kB' 'Inactive: 3514368 kB' 'Active(anon): 5434768 kB' 'Inactive(anon): 0 kB' 'Active(file): 210644 kB' 'Inactive(file): 3514368 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8966368 kB' 'Mapped: 157076 kB' 'AnonPages: 193612 kB' 'Shmem: 5241356 kB' 'KernelStack: 9944 kB' 'PageTables: 3080 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 152300 kB' 'Slab: 654552 kB' 'SReclaimable: 152300 kB' 'SUnreclaim: 502252 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:13.769 08:23:06 -- setup/common.sh@31 -- # IFS=': ' 00:05:13.769 08:23:06 -- setup/common.sh@31 -- # read -r var val _ 00:05:13.769 08:23:06 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.769 08:23:06 -- setup/common.sh@32 -- # continue 00:05:13.769 08:23:06 -- setup/common.sh@31 -- # IFS=': ' 00:05:13.769 08:23:06 -- setup/common.sh@31 -- # read -r var val _ 00:05:13.769 08:23:06 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.769 08:23:06 -- setup/common.sh@32 -- # continue 00:05:13.769 08:23:06 -- setup/common.sh@31 -- # IFS=': ' 00:05:13.769 08:23:06 -- setup/common.sh@31 -- # read -r var val _ 00:05:13.769 08:23:06 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.769 08:23:06 -- setup/common.sh@32 -- # continue 00:05:13.769 08:23:06 -- setup/common.sh@31 -- # IFS=': ' 00:05:13.769 08:23:06 -- setup/common.sh@31 -- # read -r var val _ 00:05:13.769 08:23:06 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.769 08:23:06 -- setup/common.sh@32 -- # continue 00:05:13.769 08:23:06 -- setup/common.sh@31 -- # IFS=': ' 
00:05:13.769 08:23:06 -- setup/common.sh@31 -- # read -r var val _ 00:05:13.769 08:23:06 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.769 08:23:06 -- setup/common.sh@32 -- # continue 00:05:13.769 08:23:06 -- setup/common.sh@31 -- # IFS=': ' 00:05:13.769 08:23:06 -- setup/common.sh@31 -- # read -r var val _ 00:05:13.769 08:23:06 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.769 08:23:06 -- setup/common.sh@32 -- # continue 00:05:13.769 08:23:06 -- setup/common.sh@31 -- # IFS=': ' 00:05:13.769 08:23:06 -- setup/common.sh@31 -- # read -r var val _ 00:05:13.769 08:23:06 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.769 08:23:06 -- setup/common.sh@32 -- # continue 00:05:13.769 08:23:06 -- setup/common.sh@31 -- # IFS=': ' 00:05:13.769 08:23:06 -- setup/common.sh@31 -- # read -r var val _ 00:05:13.769 08:23:06 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.769 08:23:06 -- setup/common.sh@32 -- # continue 00:05:13.769 08:23:06 -- setup/common.sh@31 -- # IFS=': ' 00:05:13.769 08:23:06 -- setup/common.sh@31 -- # read -r var val _ 00:05:13.769 08:23:06 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.769 08:23:06 -- setup/common.sh@32 -- # continue 00:05:13.769 08:23:06 -- setup/common.sh@31 -- # IFS=': ' 00:05:13.769 08:23:06 -- setup/common.sh@31 -- # read -r var val _ 00:05:13.769 08:23:06 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.769 08:23:06 -- setup/common.sh@32 -- # continue 00:05:13.769 08:23:06 -- setup/common.sh@31 -- # IFS=': ' 00:05:13.769 08:23:06 -- setup/common.sh@31 -- # read -r var val _ 00:05:13.769 08:23:06 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.769 08:23:06 -- setup/common.sh@32 -- # continue 00:05:13.769 08:23:06 -- setup/common.sh@31 -- # IFS=': ' 00:05:13.769 08:23:06 -- setup/common.sh@31 -- # read -r var val _ 00:05:13.769 08:23:06 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.769 08:23:06 -- setup/common.sh@32 -- # continue 00:05:13.769 08:23:06 -- setup/common.sh@31 -- # IFS=': ' 00:05:13.769 08:23:06 -- setup/common.sh@31 -- # read -r var val _ 00:05:13.769 08:23:06 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.769 08:23:06 -- setup/common.sh@32 -- # continue 00:05:13.769 08:23:06 -- setup/common.sh@31 -- # IFS=': ' 00:05:13.770 08:23:06 -- setup/common.sh@31 -- # read -r var val _ 00:05:13.770 08:23:06 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.770 08:23:06 -- setup/common.sh@32 -- # continue 00:05:13.770 08:23:06 -- setup/common.sh@31 -- # IFS=': ' 00:05:13.770 08:23:06 -- setup/common.sh@31 -- # read -r var val _ 00:05:13.770 08:23:06 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.770 08:23:06 -- setup/common.sh@32 -- # continue 00:05:13.770 08:23:06 -- setup/common.sh@31 -- # IFS=': ' 00:05:13.770 08:23:06 -- setup/common.sh@31 -- # read -r var val _ 00:05:13.770 08:23:06 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.770 08:23:06 -- setup/common.sh@32 -- # continue 00:05:13.770 08:23:06 -- setup/common.sh@31 -- # IFS=': ' 00:05:13.770 08:23:06 -- setup/common.sh@31 -- # read -r var val _ 00:05:13.770 08:23:06 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.770 08:23:06 -- 
setup/common.sh@32 -- # continue
[xtrace field scan continues over the remaining per-node meminfo fields (Shmem through HugePages_Free): each is tested with [[ field == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]], skipped via continue, IFS=': ', read -r var val _]
00:05:13.770 08:23:06 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:13.770 08:23:06 -- setup/common.sh@33 -- # echo 0
00:05:13.770 08:23:06 -- setup/common.sh@33 -- # return 0
00:05:13.770 08:23:06 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:05:13.770 08:23:06 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:05:13.770 08:23:06 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:05:13.770 08:23:06 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:05:13.770 08:23:06 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:05:13.770 node0=512 expecting 512
00:05:13.770 08:23:06 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:05:13.770 08:23:06 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:05:13.770 08:23:06 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:05:13.770 08:23:06 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
00:05:13.770 node1=512 expecting 512
00:05:13.770 08:23:06 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:05:13.770
00:05:13.770 real 0m3.674s
00:05:13.770 user 0m1.405s
00:05:13.770 sys 0m2.338s
00:05:13.770 08:23:06 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:05:13.770 08:23:06 -- common/autotest_common.sh@10 -- # set +x
00:05:13.770 ************************************
00:05:13.770 END TEST per_node_1G_alloc
00:05:13.770 ************************************
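The START TEST / END TEST banners and the real/user/sys timing above come from the harness's run_test helper, which names a test, times it, and frames its output. As a rough standalone sketch of that pattern only (run_test_sketch is a hypothetical name, not the actual SPDK autotest_common.sh implementation):

  # Hypothetical simplification of the run_test pattern seen in this log:
  # print a banner, time the named test command, print a closing banner.
  run_test_sketch() {
    local name=$1
    shift
    echo '************************************'
    echo "START TEST $name"
    echo '************************************'
    time "$@"    # produces the real/user/sys lines in the log
    echo '************************************'
    echo "END TEST $name"
    echo '************************************'
  }

In the trace, run_test per_node_1G_alloc per_node_1G_alloc follows this shape: the first argument names the test, the remainder is the command to run.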
00:05:13.770 08:23:06 -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc
00:05:13.770 08:23:06 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:05:13.770 08:23:06 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:05:13.770 08:23:06 -- common/autotest_common.sh@10 -- # set +x
00:05:13.770 ************************************
00:05:13.770 START TEST even_2G_alloc
00:05:13.770 ************************************
00:05:13.770 08:23:06 -- common/autotest_common.sh@1104 -- # even_2G_alloc
00:05:13.770 08:23:06 -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152
00:05:13.770 08:23:06 -- setup/hugepages.sh@49 -- # local size=2097152
00:05:13.770 08:23:06 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:05:13.770 08:23:06 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:05:13.770 08:23:06 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:05:13.770 08:23:06 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:05:13.770 08:23:06 -- setup/hugepages.sh@62 -- # user_nodes=()
00:05:13.770 08:23:06 -- setup/hugepages.sh@62 -- # local user_nodes
00:05:13.770 08:23:06 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:05:13.770 08:23:06 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:05:13.770 08:23:06 -- setup/hugepages.sh@67 -- # nodes_test=()
00:05:13.770 08:23:06 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:05:13.770 08:23:06 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:05:13.770 08:23:06 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:05:13.770 08:23:06 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:05:13.770 08:23:06 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:05:13.770 08:23:06 -- setup/hugepages.sh@83 -- # : 512
00:05:13.770 08:23:06 -- setup/hugepages.sh@84 -- # : 1
00:05:13.770 08:23:06 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:05:13.770 08:23:06 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:05:13.770 08:23:06 -- setup/hugepages.sh@83 -- # : 0
00:05:13.770 08:23:06 -- setup/hugepages.sh@84 -- # : 0
00:05:13.770 08:23:06 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:05:13.770 08:23:06 -- setup/hugepages.sh@153 -- # NRHUGE=1024
00:05:13.770 08:23:06 -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes
00:05:13.770 08:23:06 -- setup/hugepages.sh@153 -- # setup output
00:05:13.770 08:23:06 -- setup/common.sh@9 -- # [[ output == output ]]
00:05:13.770 08:23:06 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:17.083 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:17.083 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:17.083 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:17.083 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:17.083 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:17.083 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:17.083 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:17.083 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:17.083 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:17.083 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:17.083 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:17.083 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:17.083 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:17.083 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:17.083 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:17.083 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:17.083 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
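The get_test_nr_hugepages trace above shows the arithmetic behind the even allocation: a 2097152 kB (2 GiB) request divided by the 2048 kB hugepage size gives nr_hugepages=1024, and with HUGE_EVEN_ALLOC=yes the per-node loop splits that evenly as 512 pages on each of the 2 NUMA nodes. A minimal standalone sketch of that split (illustrative variable names, not the actual hugepages.sh code):

  # Assumed standalone rendition of the even-split logic visible in the trace.
  size_kb=2097152        # requested pool size in kB
  hugepage_kb=2048       # Hugepagesize reported in /proc/meminfo
  no_nodes=2             # NUMA node count on this host
  nr_hugepages=$(( size_kb / hugepage_kb ))   # 1024
  per_node=$(( nr_hugepages / no_nodes ))     # 512
  for (( node = 0; node < no_nodes; node++ )); do
    nodes_test[node]=$per_node
  done
  echo "nr_hugepages=$nr_hugepages, per node: ${nodes_test[*]}"

That is exactly the "node0=512 expecting 512" / "node1=512 expecting 512" accounting the previous test printed.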
00:05:17.083 08:23:09 -- setup/hugepages.sh@154 -- # verify_nr_hugepages
00:05:17.083 08:23:09 -- setup/hugepages.sh@89 -- # local node
00:05:17.083 08:23:09 -- setup/hugepages.sh@90 -- # local sorted_t
00:05:17.083 08:23:09 -- setup/hugepages.sh@91 -- # local sorted_s
00:05:17.084 08:23:09 -- setup/hugepages.sh@92 -- # local surp
00:05:17.084 08:23:09 -- setup/hugepages.sh@93 -- # local resv
00:05:17.084 08:23:09 -- setup/hugepages.sh@94 -- # local anon
00:05:17.084 08:23:09 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:17.084 08:23:09 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:05:17.084 08:23:09 -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:17.084 08:23:09 -- setup/common.sh@18 -- # local node=
00:05:17.084 08:23:09 -- setup/common.sh@19 -- # local var val
00:05:17.084 08:23:09 -- setup/common.sh@20 -- # local mem_f mem
00:05:17.084 08:23:09 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:17.084 08:23:09 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:17.084 08:23:09 -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:17.084 08:23:09 -- setup/common.sh@28 -- # mapfile -t mem
00:05:17.084 08:23:09 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:17.084 08:23:09 -- setup/common.sh@31 -- # IFS=': '
00:05:17.084 08:23:09 -- setup/common.sh@31 -- # read -r var val _
00:05:17.084 08:23:09 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283780 kB' 'MemFree: 40388020 kB' 'MemAvailable: 44255524 kB' 'Buffers: 2708 kB' 'Cached: 13599192 kB' 'SwapCached: 0 kB' 'Active: 10344824 kB' 'Inactive: 3768020 kB' 'Active(anon): 9886020 kB' 'Inactive(anon): 0 kB' 'Active(file): 458804 kB' 'Inactive(file): 3768020 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 513720 kB' 'Mapped: 209828 kB' 'Shmem: 9375076 kB' 'KReclaimable: 275092 kB' 'Slab: 1240864 kB' 'SReclaimable: 275092 kB' 'SUnreclaim: 965772 kB' 'KernelStack: 21872 kB' 'PageTables: 8436 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 11098308 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216976 kB' 'VmallocChunk: 0 kB' 'Percpu: 85568 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3226996 kB' 'DirectMap2M: 12187648 kB' 'DirectMap1G: 54525952 kB'
[xtrace field scan: each snapshot field from MemTotal through HardwareCorrupted is tested with [[ field == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] and skipped via continue, IFS=': ', read -r var val _]
00:05:17.084 08:23:09 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:17.084 08:23:09 -- setup/common.sh@33 -- # echo 0
00:05:17.084 08:23:09 -- setup/common.sh@33 -- # return 0
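The get_meminfo trace above shows the parsing pattern: read the meminfo snapshot, split each line on ': ', and echo the value once the requested key matches, which is why the xtrace prints one [[ ... ]] / continue pair per field until the hit. A simplified standalone version of that pattern (get_meminfo_sketch is a hypothetical name; it handles only /proc/meminfo, while the real setup/common.sh also reads the per-node files and strips their "Node N" prefix, as the mem=("${mem[@]#Node +([0-9]) }") line shows):

  # Hypothetical simplified get_meminfo: echo the numeric value of one key.
  get_meminfo_sketch() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do   # split "Key:   value kB"
      [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done < /proc/meminfo
    return 1
  }

On this host, get_meminfo_sketch AnonHugePages would print 0, matching the echo 0 above.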
00:05:17.085 08:23:09 -- setup/common.sh@33 -- # return 0
00:05:17.085 08:23:09 -- setup/hugepages.sh@97 -- # anon=0
00:05:17.085 08:23:09 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:05:17.085 08:23:09 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:17.085 08:23:09 -- setup/common.sh@18 -- # local node=
00:05:17.085 08:23:09 -- setup/common.sh@19 -- # local var val
00:05:17.085 08:23:09 -- setup/common.sh@20 -- # local mem_f mem
00:05:17.085 08:23:09 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:17.085 08:23:09 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:17.085 08:23:09 -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:17.085 08:23:09 -- setup/common.sh@28 -- # mapfile -t mem
00:05:17.085 08:23:09 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:17.085 08:23:09 -- setup/common.sh@31 -- # IFS=': '
00:05:17.085 08:23:09 -- setup/common.sh@31 -- # read -r var val _
00:05:17.085 08:23:09 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283780 kB' 'MemFree: 40390388 kB' 'MemAvailable: 44257892 kB' 'Buffers: 2708 kB' 'Cached: 13599196 kB' 'SwapCached: 0 kB' 'Active: 10339924 kB' 'Inactive: 3768020 kB' 'Active(anon): 9881120 kB' 'Inactive(anon): 0 kB' 'Active(file): 458804 kB' 'Inactive(file): 3768020 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 509264 kB' 'Mapped: 209560 kB' 'Shmem: 9375080 kB' 'KReclaimable: 275092 kB' 'Slab: 1240808 kB' 'SReclaimable: 275092 kB' 'SUnreclaim: 965716 kB' 'KernelStack: 21824 kB' 'PageTables: 8272 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 11093532 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216960 kB' 'VmallocChunk: 0 kB' 'Percpu: 85568 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3226996 kB' 'DirectMap2M: 12187648 kB' 'DirectMap1G: 54525952 kB'
[xtrace field scan: each snapshot field from MemTotal through HugePages_Free is tested with [[ field == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] and skipped via continue, IFS=': ', read -r var val _]
00:05:17.086 08:23:09 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:17.086 08:23:09 -- setup/common.sh@33 -- # echo 0
00:05:17.086 08:23:09 -- setup/common.sh@33 -- # return 0
00:05:17.086 08:23:09 -- setup/hugepages.sh@99 -- # surp=0
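verify_nr_hugepages collects three counters from the same snapshot: AnonHugePages counts transparent hugepages backing anonymous memory, HugePages_Surp counts surplus pages allocated beyond the configured pool, and HugePages_Rsvd (fetched next) counts pages reserved by mappings but not yet faulted in; all three are expected to be 0 here. A one-liner sketch for pulling the same counters outside the test harness (illustrative, not part of the SPDK scripts):

  # Assumed ad-hoc equivalent of the get_meminfo calls in this trace.
  awk '/^(AnonHugePages|HugePages_(Total|Free|Rsvd|Surp)):/ { print $1, $2 }' /proc/meminfo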
00:05:17.086 08:23:09 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:05:17.086 08:23:09 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:17.086 08:23:09 -- setup/common.sh@18 -- # local node=
00:05:17.086 08:23:09 -- setup/common.sh@19 -- # local var val
00:05:17.086 08:23:09 -- setup/common.sh@20 -- # local mem_f mem
00:05:17.086 08:23:09 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:17.086 08:23:09 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:17.086 08:23:09 -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:17.086 08:23:09 -- setup/common.sh@28 -- # mapfile -t mem
00:05:17.086 08:23:09 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:17.086 08:23:09 -- setup/common.sh@31 -- # IFS=': '
00:05:17.086 08:23:09 -- setup/common.sh@31 -- # read -r var val _
00:05:17.086 08:23:09 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283780 kB' 'MemFree: 40390696 kB' 'MemAvailable: 44258200 kB' 'Buffers: 2708 kB' 'Cached: 13599196 kB' 'SwapCached: 0 kB' 'Active: 10339732 kB' 'Inactive: 3768020 kB' 'Active(anon): 9880928 kB' 'Inactive(anon): 0 kB' 'Active(file): 458804 kB' 'Inactive(file): 3768020 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 509080 kB' 'Mapped: 209220 kB' 'Shmem: 9375080 kB' 'KReclaimable: 275092 kB' 'Slab: 1240808 kB' 'SReclaimable: 275092 kB' 'SUnreclaim: 965716 kB' 'KernelStack: 21808 kB' 'PageTables: 8196 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 11093548 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216944 kB' 'VmallocChunk: 0 kB' 'Percpu: 85568 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3226996 kB' 'DirectMap2M: 12187648 kB' 'DirectMap1G: 54525952 kB'
[xtrace field scan: each snapshot field from MemTotal through HugePages_Free is tested with [[ field == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] and skipped via continue, IFS=': ', read -r var val _]
00:05:17.088 08:23:09 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:17.088 08:23:09 -- setup/common.sh@33 -- # echo 0
00:05:17.088 08:23:09 -- setup/common.sh@33 -- # return 0
00:05:17.088 08:23:09 -- setup/hugepages.sh@100 -- # resv=0
00:05:17.088 08:23:09 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:05:17.088 nr_hugepages=1024
00:05:17.088 08:23:09 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:05:17.088 resv_hugepages=0
00:05:17.088 08:23:09 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:05:17.088 surplus_hugepages=0
00:05:17.088 08:23:09 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:05:17.088 anon_hugepages=0
00:05:17.088 08:23:09 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:17.088 08:23:09 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
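The two (( ... )) checks above carry the verification: the observed page count (1024) must equal the requested nr_hugepages, and it must also account for surplus and reserved pages, so a pool that was only partially allocated, or padded with surplus pages, fails the test. The same invariant as a standalone check, a sketch assuming the hypothetical get_meminfo_sketch helper defined earlier:

  # Hypothetical standalone version of the accounting check in the trace.
  expected=1024
  total=$(get_meminfo_sketch HugePages_Total)
  surp=$(get_meminfo_sketch HugePages_Surp)
  resv=$(get_meminfo_sketch HugePages_Rsvd)
  (( expected == total + surp + resv )) || echo "unexpected surplus/reserved pages"
  (( expected == total ))               || echo "pool size mismatch"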
00:05:17.088 08:23:09 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:05:17.088 08:23:09 -- setup/common.sh@17 -- # local get=HugePages_Total
00:05:17.088 08:23:09 -- setup/common.sh@18 -- # local node=
00:05:17.088 08:23:09 -- setup/common.sh@19 -- # local var val
00:05:17.088 08:23:09 -- setup/common.sh@20 -- # local mem_f mem
00:05:17.088 08:23:09 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:17.088 08:23:09 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:17.088 08:23:09 -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:17.088 08:23:09 -- setup/common.sh@28 -- # mapfile -t mem
00:05:17.088 08:23:09 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:17.088 08:23:09 -- setup/common.sh@31 -- # IFS=': '
00:05:17.088 08:23:09 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283780 kB' 'MemFree: 40391688 kB' 'MemAvailable: 44259192 kB' 'Buffers: 2708 kB' 'Cached: 13599220 kB' 'SwapCached: 0 kB' 'Active: 10339856 kB' 'Inactive: 3768020 kB' 'Active(anon): 9881052 kB' 'Inactive(anon): 0 kB' 'Active(file): 458804 kB' 'Inactive(file): 3768020 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 509208 kB' 'Mapped: 209220 kB' 'Shmem: 9375104 kB' 'KReclaimable: 275092 kB' 'Slab: 1240800 kB' 'SReclaimable: 275092 kB' 'SUnreclaim: 965708 kB' 'KernelStack: 21840 kB' 'PageTables: 8296 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 11093560 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216944 kB' 'VmallocChunk: 0 kB' 'Percpu: 85568 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3226996 kB' 'DirectMap2M: 12187648 kB' 'DirectMap1G: 54525952 kB'
00:05:17.088 08:23:09 -- setup/common.sh@31 -- # read -r var val _
[xtrace field scan for HugePages_Total in progress: fields MemTotal through Shmem tested so far with [[ field == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] and skipped via continue, IFS=': ', read -r var val _]
00:05:17.089 08:23:09 --
setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # continue 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # IFS=': ' 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # read -r var val _ 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # continue 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # IFS=': ' 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # read -r var val _ 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # continue 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # IFS=': ' 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # read -r var val _ 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # continue 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # IFS=': ' 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # read -r var val _ 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # continue 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # IFS=': ' 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # read -r var val _ 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # continue 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # IFS=': ' 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # read -r var val _ 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # continue 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # IFS=': ' 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # read -r var val _ 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # continue 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # IFS=': ' 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # read -r var val _ 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # continue 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # IFS=': ' 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # read -r var val _ 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # continue 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # IFS=': ' 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # read -r var val _ 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # continue 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # IFS=': ' 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # read -r var val _ 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # continue 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # IFS=': ' 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # read -r var val _ 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # continue 00:05:17.089 
08:23:09 -- setup/common.sh@31 -- # IFS=': ' 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # read -r var val _ 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # continue 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # IFS=': ' 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # read -r var val _ 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # continue 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # IFS=': ' 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # read -r var val _ 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # continue 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # IFS=': ' 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # read -r var val _ 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # continue 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # IFS=': ' 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # read -r var val _ 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # continue 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # IFS=': ' 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # read -r var val _ 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # continue 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # IFS=': ' 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # read -r var val _ 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # continue 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # IFS=': ' 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # read -r var val _ 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # continue 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # IFS=': ' 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # read -r var val _ 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # continue 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # IFS=': ' 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # read -r var val _ 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # continue 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # IFS=': ' 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # read -r var val _ 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # continue 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # IFS=': ' 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # read -r var val _ 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.089 08:23:09 -- setup/common.sh@32 -- # continue 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # IFS=': ' 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # read -r var val _ 00:05:17.089 08:23:09 -- 
setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.089 08:23:09 -- setup/common.sh@33 -- # echo 1024 00:05:17.089 08:23:09 -- setup/common.sh@33 -- # return 0 00:05:17.089 08:23:09 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:17.089 08:23:09 -- setup/hugepages.sh@112 -- # get_nodes 00:05:17.089 08:23:09 -- setup/hugepages.sh@27 -- # local node 00:05:17.089 08:23:09 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:17.089 08:23:09 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:05:17.089 08:23:09 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:17.089 08:23:09 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:05:17.089 08:23:09 -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:17.089 08:23:09 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:17.089 08:23:09 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:17.089 08:23:09 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:17.089 08:23:09 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:17.089 08:23:09 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:17.089 08:23:09 -- setup/common.sh@18 -- # local node=0 00:05:17.089 08:23:09 -- setup/common.sh@19 -- # local var val 00:05:17.089 08:23:09 -- setup/common.sh@20 -- # local mem_f mem 00:05:17.089 08:23:09 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:17.089 08:23:09 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:17.089 08:23:09 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:17.089 08:23:09 -- setup/common.sh@28 -- # mapfile -t mem 00:05:17.089 08:23:09 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # IFS=': ' 00:05:17.089 08:23:09 -- setup/common.sh@31 -- # read -r var val _ 00:05:17.090 08:23:09 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 24888176 kB' 'MemUsed: 7697192 kB' 'SwapCached: 0 kB' 'Active: 4698796 kB' 'Inactive: 253652 kB' 'Active(anon): 4450636 kB' 'Inactive(anon): 0 kB' 'Active(file): 248160 kB' 'Inactive(file): 253652 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4635532 kB' 'Mapped: 52648 kB' 'AnonPages: 320044 kB' 'Shmem: 4133720 kB' 'KernelStack: 11896 kB' 'PageTables: 5176 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 122792 kB' 'Slab: 586236 kB' 'SReclaimable: 122792 kB' 'SUnreclaim: 463444 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:17.090 08:23:09 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.090 08:23:09 -- setup/common.sh@32 -- # continue 00:05:17.090 08:23:09 -- setup/common.sh@31 -- # IFS=': ' 00:05:17.090 08:23:09 -- setup/common.sh@31 -- # read -r var val _ 00:05:17.090 08:23:09 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.090 08:23:09 -- setup/common.sh@32 -- # continue 00:05:17.090 08:23:09 -- setup/common.sh@31 -- # IFS=': ' 00:05:17.090 08:23:09 -- setup/common.sh@31 -- # read -r var val _ 00:05:17.090 08:23:09 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.090 08:23:09 -- setup/common.sh@32 -- # continue 
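The condensed spans above and below are get_meminfo (setup/common.sh) walking a meminfo file one field at a time: split each line on ': ', continue until the requested key matches, then echo the value and return. A minimal standalone sketch of that pattern; the function name get_meminfo_value is illustrative, not the script's own:

    #!/usr/bin/env bash
    # Look up one /proc/meminfo field the way the traced loop does.
    get_meminfo_value() {
        local get=$1 var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue   # skip non-matching fields
            echo "$val"                        # e.g. 1024 for HugePages_Total
            return 0
        done < /proc/meminfo
        return 1                               # field not present
    }

    get_meminfo_value HugePages_Total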
00:05:17.089 08:23:09 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:17.089 08:23:09 -- setup/common.sh@18 -- # local node=0
00:05:17.089 08:23:09 -- setup/common.sh@19 -- # local var val
00:05:17.089 08:23:09 -- setup/common.sh@20 -- # local mem_f mem
00:05:17.089 08:23:09 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:17.089 08:23:09 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:05:17.089 08:23:09 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:05:17.089 08:23:09 -- setup/common.sh@28 -- # mapfile -t mem
00:05:17.089 08:23:09 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:17.089 08:23:09 -- setup/common.sh@31 -- # IFS=': '
00:05:17.089 08:23:09 -- setup/common.sh@31 -- # read -r var val _
00:05:17.090 08:23:09 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 24888176 kB' 'MemUsed: 7697192 kB' 'SwapCached: 0 kB' 'Active: 4698796 kB' 'Inactive: 253652 kB' 'Active(anon): 4450636 kB' 'Inactive(anon): 0 kB' 'Active(file): 248160 kB' 'Inactive(file): 253652 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4635532 kB' 'Mapped: 52648 kB' 'AnonPages: 320044 kB' 'Shmem: 4133720 kB' 'KernelStack: 11896 kB' 'PageTables: 5176 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 122792 kB' 'Slab: 586236 kB' 'SReclaimable: 122792 kB' 'SUnreclaim: 463444 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:05:17.090 [xtrace condensed: node0 fields MemTotal through HugePages_Free tested against HugePages_Surp, continue on each miss]
00:05:17.090 08:23:09 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:17.090 08:23:09 -- setup/common.sh@33 -- # echo 0
00:05:17.090 08:23:09 -- setup/common.sh@33 -- # return 0
00:05:17.090 08:23:09 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:05:17.090 08:23:09 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:05:17.090 08:23:09 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:05:17.090 08:23:09 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:05:17.090 08:23:09 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:17.091 08:23:09 -- setup/common.sh@18 -- # local node=1
00:05:17.091 08:23:09 -- setup/common.sh@19 -- # local var val
00:05:17.091 08:23:09 -- setup/common.sh@20 -- # local mem_f mem
00:05:17.091 08:23:09 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:17.091 08:23:09 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:05:17.091 08:23:09 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:05:17.091 08:23:09 -- setup/common.sh@28 -- # mapfile -t mem
00:05:17.091 08:23:09 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
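The mapfile/strip pair above is how per-node reads differ from the global one: every line of /sys/devices/system/node/nodeN/meminfo carries a "Node N " prefix, and the extglob pattern #Node +([0-9]) removes it so the fields parse exactly like /proc/meminfo. A small demo of the idiom (node0 path and sample values taken from the dump above):

    #!/usr/bin/env bash
    shopt -s extglob   # the +([0-9]) pattern below needs extglob

    mapfile -t mem < /sys/devices/system/node/node0/meminfo
    # Lines arrive as "Node 0 MemTotal: 32585368 kB"; strip "Node <N> ".
    mem=("${mem[@]#Node +([0-9]) }")
    printf '%s\n' "${mem[@]:0:2}"
    # MemTotal: 32585368 kB
    # MemFree: 24888176 kB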
00:05:17.091 08:23:09 -- setup/common.sh@31 -- # IFS=': '
00:05:17.091 08:23:09 -- setup/common.sh@31 -- # read -r var val _
00:05:17.091 08:23:09 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698412 kB' 'MemFree: 15503260 kB' 'MemUsed: 12195152 kB' 'SwapCached: 0 kB' 'Active: 5640948 kB' 'Inactive: 3514368 kB' 'Active(anon): 5430304 kB' 'Inactive(anon): 0 kB' 'Active(file): 210644 kB' 'Inactive(file): 3514368 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8966412 kB' 'Mapped: 156572 kB' 'AnonPages: 189004 kB' 'Shmem: 5241400 kB' 'KernelStack: 9928 kB' 'PageTables: 3068 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 152300 kB' 'Slab: 654564 kB' 'SReclaimable: 152300 kB' 'SUnreclaim: 502264 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:05:17.091 [xtrace condensed: node1 fields MemTotal through HugePages_Free tested against HugePages_Surp, continue on each miss]
00:05:17.092 08:23:09 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:17.092 08:23:09 -- setup/common.sh@33 -- # echo 0
00:05:17.092 08:23:09 -- setup/common.sh@33 -- # return 0
00:05:17.092 08:23:09 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:05:17.092 08:23:09 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:05:17.092 08:23:09 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:05:17.092 08:23:09 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:05:17.092 08:23:09 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:05:17.092 node0=512 expecting 512
00:05:17.092 08:23:09 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:05:17.092 08:23:09 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:05:17.092 08:23:09 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:05:17.092 08:23:09 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
00:05:17.092 node1=512 expecting 512
00:05:17.092 08:23:09 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:05:17.092 
00:05:17.092 real	0m3.514s
00:05:17.092 user	0m1.298s
00:05:17.092 sys	0m2.279s
00:05:17.092 08:23:09 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:05:17.092 08:23:09 -- common/autotest_common.sh@10 -- # set +x
00:05:17.092 ************************************
00:05:17.092 END TEST even_2G_alloc
00:05:17.092 ************************************
00:05:17.092 08:23:09 -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc
00:05:17.092 08:23:09 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:05:17.092 08:23:09 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:05:17.092 08:23:09 -- common/autotest_common.sh@10 -- # set +x
00:05:17.092 ************************************
00:05:17.092 START TEST odd_alloc
00:05:17.092 ************************************
00:05:17.092 08:23:09 -- common/autotest_common.sh@1104 -- # odd_alloc
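odd_alloc sizes the pool at 2098176 kB, i.e. HUGEMEM=2049 MiB of 2048 kB pages, which rounds to an odd nr_hugepages=1025; with two NUMA nodes the per-node loop leaves node1 with 512 pages and node0 with 513, as the xtrace that follows shows. A reconstruction of that split, consistent with the trace (the script's own source may phrase the arithmetic differently):

    #!/usr/bin/env bash
    # 1025 pages over 2 nodes: assign floor shares from the highest
    # node down, so the remainder lands on node0 (513 + 512 = 1025).
    _nr_hugepages=1025
    _no_nodes=2
    declare -a nodes_test
    while (( _no_nodes > 0 )); do
        nodes_test[_no_nodes - 1]=$(( _nr_hugepages / _no_nodes ))
        : $(( _nr_hugepages -= nodes_test[_no_nodes - 1] ))  # 513, then 0
        : $(( --_no_nodes ))                                 # 1, then 0
    done
    echo "node0=${nodes_test[0]} node1=${nodes_test[1]}"     # node0=513 node1=512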
00:05:17.092 08:23:09 -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176
00:05:17.092 08:23:09 -- setup/hugepages.sh@49 -- # local size=2098176
00:05:17.092 08:23:09 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:05:17.092 08:23:09 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:05:17.092 08:23:09 -- setup/hugepages.sh@57 -- # nr_hugepages=1025
00:05:17.092 08:23:09 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:05:17.092 08:23:09 -- setup/hugepages.sh@62 -- # user_nodes=()
00:05:17.092 08:23:09 -- setup/hugepages.sh@62 -- # local user_nodes
00:05:17.092 08:23:09 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025
00:05:17.092 08:23:09 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:05:17.351 08:23:09 -- setup/hugepages.sh@67 -- # nodes_test=()
00:05:17.351 08:23:09 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:05:17.351 08:23:09 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:05:17.351 08:23:09 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:05:17.351 08:23:09 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:05:17.352 08:23:09 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:05:17.352 08:23:09 -- setup/hugepages.sh@83 -- # : 513
00:05:17.352 08:23:09 -- setup/hugepages.sh@84 -- # : 1
00:05:17.352 08:23:09 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:05:17.352 08:23:09 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513
00:05:17.352 08:23:09 -- setup/hugepages.sh@83 -- # : 0
00:05:17.352 08:23:09 -- setup/hugepages.sh@84 -- # : 0
00:05:17.352 08:23:09 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:05:17.352 08:23:09 -- setup/hugepages.sh@160 -- # HUGEMEM=2049
00:05:17.352 08:23:09 -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes
00:05:17.352 08:23:09 -- setup/hugepages.sh@160 -- # setup output
00:05:17.352 08:23:09 -- setup/common.sh@9 -- # [[ output == output ]]
00:05:17.352 08:23:09 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:20.643 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:20.643 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:20.643 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:20.643 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:20.643 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:20.643 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:20.643 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:20.643 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:20.643 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:20.643 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:20.643 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:20.643 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:20.643 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:20.643 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:20.643 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:20.643 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:20.643 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:05:20.643 08:23:13 -- setup/hugepages.sh@161 -- # verify_nr_hugepages
00:05:20.643 08:23:13 -- setup/hugepages.sh@89 -- # local node
00:05:20.643 08:23:13 -- setup/hugepages.sh@90 -- # local sorted_t
00:05:20.643 08:23:13 -- setup/hugepages.sh@91 -- # local sorted_s
00:05:20.643 08:23:13 -- setup/hugepages.sh@92 -- # local surp
00:05:20.643 08:23:13 -- setup/hugepages.sh@93 -- # local resv
00:05:20.643 08:23:13 -- setup/hugepages.sh@94 -- # local anon
00:05:20.643 08:23:13 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:20.643 08:23:13 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:05:20.643 08:23:13 -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:20.643 08:23:13 -- setup/common.sh@18 -- # local node=
00:05:20.643 08:23:13 -- setup/common.sh@19 -- # local var val
00:05:20.643 08:23:13 -- setup/common.sh@20 -- # local mem_f mem
00:05:20.643 08:23:13 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:20.643 08:23:13 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:20.643 08:23:13 -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:20.643 08:23:13 -- setup/common.sh@28 -- # mapfile -t mem
00:05:20.643 08:23:13 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:20.643 08:23:13 -- setup/common.sh@31 -- # IFS=': '
00:05:20.643 08:23:13 -- setup/common.sh@31 -- # read -r var val _
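The hugepages.sh@96 test above gates on transparent hugepage policy: "always [madvise] never" is the literal content of the kernel's THP switch, and the pattern match only fails when the bracketed mode is [never]. An equivalent manual check, as a sketch against the standard Linux sysfs path:

    #!/usr/bin/env bash
    # Active THP mode is the bracketed token, "[madvise]" in this trace.
    thp=$(</sys/kernel/mm/transparent_hugepage/enabled)
    echo "$thp"                      # always [madvise] never
    if [[ $thp != *"[never]"* ]]; then
        echo "THP enabled; AnonHugePages in /proc/meminfo may be nonzero"
    fi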
00:05:20.643 08:23:13 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283780 kB' 'MemFree: 40448680 kB' 'MemAvailable: 44316136 kB' 'Buffers: 2708 kB' 'Cached: 13599324 kB' 'SwapCached: 0 kB' 'Active: 10339344 kB' 'Inactive: 3768020 kB' 'Active(anon): 9880540 kB' 'Inactive(anon): 0 kB' 'Active(file): 458804 kB' 'Inactive(file): 3768020 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 508584 kB' 'Mapped: 209272 kB' 'Shmem: 9375208 kB' 'KReclaimable: 274996 kB' 'Slab: 1240644 kB' 'SReclaimable: 274996 kB' 'SUnreclaim: 965648 kB' 'KernelStack: 21904 kB' 'PageTables: 8208 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480892 kB' 'Committed_AS: 11098580 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217072 kB' 'VmallocChunk: 0 kB' 'Percpu: 85568 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3226996 kB' 'DirectMap2M: 12187648 kB' 'DirectMap1G: 54525952 kB'
00:05:20.644 [xtrace condensed: fields MemTotal through HardwareCorrupted tested against AnonHugePages, continue on each miss]
00:05:20.644 08:23:13 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:20.644 08:23:13 -- setup/common.sh@33 -- # echo 0
00:05:20.644 08:23:13 -- setup/common.sh@33 -- # return 0
00:05:20.644 08:23:13 -- setup/hugepages.sh@97 -- # anon=0
00:05:20.644 08:23:13 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:05:20.644 08:23:13 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:20.644 08:23:13 -- setup/common.sh@18 -- # local node=
00:05:20.644 08:23:13 -- setup/common.sh@19 -- # local var val
00:05:20.644 08:23:13 -- setup/common.sh@20 -- # local mem_f mem
00:05:20.644 08:23:13 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:20.644 08:23:13 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:20.644 08:23:13 -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:20.644 08:23:13 -- setup/common.sh@28 -- # mapfile -t mem
00:05:20.644 08:23:13 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:20.644 08:23:13 -- setup/common.sh@31 -- # IFS=': '
00:05:20.644 08:23:13 -- setup/common.sh@31 -- # read -r var val _
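get_meminfo is called here without a node argument, so it falls back to /proc/meminfo for the global surplus figure. The same number is exposed directly in sysfs; a quick manual check, assuming the 2048 kB page size reported above:

    #!/usr/bin/env bash
    # Global surplus 2 MiB hugepages (the trace reports 0):
    cat /sys/kernel/mm/hugepages/hugepages-2048kB/surplus_hugepages
    # Per-node figure, from the same file get_meminfo parsed earlier:
    grep HugePages_Surp /sys/devices/system/node/node0/meminfo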
965640 kB' 'KernelStack: 22112 kB' 'PageTables: 8848 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480892 kB' 'Committed_AS: 11098592 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217024 kB' 'VmallocChunk: 0 kB' 'Percpu: 85568 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3226996 kB' 'DirectMap2M: 12187648 kB' 'DirectMap1G: 54525952 kB' 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ Active(file) 
== \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read 
-r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 
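[editor's note] The loop traced above is setup/common.sh's get_meminfo walking /proc/meminfo one field at a time: it splits each line with IFS=': ' into a key and a value, and `continue`s past every key until it reaches the one requested (here HugePages_Surp), whose value it echoes. The following standalone Bash sketch reconstructs that logic from the trace for readability; the function name and the "Node N " prefix strip mirror what the log shows, but this is an illustration, not the verbatim SPDK helper.

shopt -s extglob
get_meminfo() {
    local get=$1 node=${2:-} line var val _
    local mem_f=/proc/meminfo mem
    # Per-node queries read the sysfs copy instead (see the node=0/node=1
    # passes later in this log).
    [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
        mem_f=/sys/devices/system/node/node$node/meminfo
    mapfile -t mem < "$mem_f"
    # sysfs per-node lines are prefixed "Node <n> "; strip it, as the
    # traced mem=("${mem[@]#Node +([0-9]) }") expansion does.
    mem=("${mem[@]#Node +([0-9]) }")
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] && { echo "${val:-0}"; return 0; }
    done
    echo 0
}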
00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.645 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.645 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.646 08:23:13 -- 
setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.646 08:23:13 -- setup/common.sh@33 -- # echo 0 00:05:20.646 08:23:13 -- setup/common.sh@33 -- # return 0 00:05:20.646 08:23:13 -- setup/hugepages.sh@99 -- # surp=0 00:05:20.646 08:23:13 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:20.646 08:23:13 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:20.646 08:23:13 -- setup/common.sh@18 -- # local node= 00:05:20.646 08:23:13 -- setup/common.sh@19 -- # local var val 00:05:20.646 08:23:13 -- setup/common.sh@20 -- # local mem_f mem 00:05:20.646 08:23:13 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:20.646 08:23:13 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:20.646 08:23:13 -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:20.646 08:23:13 -- setup/common.sh@28 -- # mapfile -t mem 00:05:20.646 08:23:13 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:20.646 08:23:13 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283780 kB' 'MemFree: 40449404 kB' 'MemAvailable: 44316860 kB' 'Buffers: 2708 kB' 'Cached: 13599328 kB' 'SwapCached: 0 kB' 'Active: 10338920 kB' 'Inactive: 3768020 kB' 'Active(anon): 9880116 kB' 'Inactive(anon): 0 kB' 'Active(file): 458804 kB' 'Inactive(file): 3768020 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 508584 kB' 'Mapped: 209164 kB' 'Shmem: 9375212 kB' 'KReclaimable: 274996 kB' 'Slab: 1240624 kB' 'SReclaimable: 274996 kB' 'SUnreclaim: 965628 kB' 'KernelStack: 22144 kB' 'PageTables: 8784 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480892 kB' 'Committed_AS: 11098608 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217072 kB' 'VmallocChunk: 0 kB' 'Percpu: 85568 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3226996 kB' 'DirectMap2M: 12187648 kB' 'DirectMap1G: 54525952 kB' 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # read -r var 
val _ 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.646 08:23:13 -- 
setup/common.sh@31 -- # IFS=': ' 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.646 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.646 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.647 
08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 
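[editor's note] Once the HugePages_Rsvd scan just below also returns 0, hugepages.sh asserts its accounting: the test requested an odd page count (1025), and the kernel must report that total back with no reserved or surplus pages, i.e. (( 1025 == nr_hugepages + surp + resv )). A hypothetical standalone version of that check, reusing the get_meminfo sketch above:

nr_hugepages=1025
surp=$(get_meminfo HugePages_Surp)    # 0 in this run
resv=$(get_meminfo HugePages_Rsvd)    # 0 in this run
total=$(get_meminfo HugePages_Total)  # 1025 in this run
(( total == nr_hugepages + surp + resv )) &&
    echo "nr_hugepages=$nr_hugepages" ||
    echo "hugepage accounting mismatch" >&2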
00:05:20.647 08:23:13 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:20.647 08:23:13 -- setup/common.sh@33 -- # echo 0 00:05:20.647 08:23:13 -- setup/common.sh@33 -- # return 0 00:05:20.647 08:23:13 -- setup/hugepages.sh@100 -- # resv=0 00:05:20.647 08:23:13 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 00:05:20.647 nr_hugepages=1025 00:05:20.647 08:23:13 -- setup/hugepages.sh@103 -- # 
echo resv_hugepages=0 00:05:20.647 resv_hugepages=0 00:05:20.647 08:23:13 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:20.647 surplus_hugepages=0 00:05:20.647 08:23:13 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:20.647 anon_hugepages=0 00:05:20.647 08:23:13 -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:05:20.647 08:23:13 -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:05:20.647 08:23:13 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:20.647 08:23:13 -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:20.647 08:23:13 -- setup/common.sh@18 -- # local node= 00:05:20.647 08:23:13 -- setup/common.sh@19 -- # local var val 00:05:20.647 08:23:13 -- setup/common.sh@20 -- # local mem_f mem 00:05:20.647 08:23:13 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:20.647 08:23:13 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:20.647 08:23:13 -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:20.647 08:23:13 -- setup/common.sh@28 -- # mapfile -t mem 00:05:20.647 08:23:13 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.647 08:23:13 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283780 kB' 'MemFree: 40452236 kB' 'MemAvailable: 44319692 kB' 'Buffers: 2708 kB' 'Cached: 13599352 kB' 'SwapCached: 0 kB' 'Active: 10338484 kB' 'Inactive: 3768020 kB' 'Active(anon): 9879680 kB' 'Inactive(anon): 0 kB' 'Active(file): 458804 kB' 'Inactive(file): 3768020 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 508116 kB' 'Mapped: 209164 kB' 'Shmem: 9375236 kB' 'KReclaimable: 274996 kB' 'Slab: 1240560 kB' 'SReclaimable: 274996 kB' 'SUnreclaim: 965564 kB' 'KernelStack: 22080 kB' 'PageTables: 8660 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480892 kB' 'Committed_AS: 11097116 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217040 kB' 'VmallocChunk: 0 kB' 'Percpu: 85568 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3226996 kB' 'DirectMap2M: 12187648 kB' 'DirectMap1G: 54525952 kB' 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.647 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # [[ Buffers == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.647 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 
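[editor's note] After the global HugePages_Total scan confirms 1025, the get_nodes pass that follows splits the odd total across the two NUMA nodes (512 on node 0, 513 on node 1) and re-checks each node's HugePages_Surp against /sys/devices/system/node/nodeN/meminfo. A minimal sketch of that per-node walk, again assuming the illustrative get_meminfo above; the loop body is a reconstruction, not the exact hugepages.sh code:

# assumes two populated nodes, as in this run (512 + 513 = 1025)
for node_dir in /sys/devices/system/node/node[0-9]*; do
    n=${node_dir##*node}
    total=$(get_meminfo HugePages_Total "$n")
    surp=$(get_meminfo HugePages_Surp "$n")
    echo "node$n: total=$total surp=$surp"
done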
00:05:20.648 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.648 
08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.648 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.648 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.649 08:23:13 -- 
setup/common.sh@31 -- # read -r var val _ 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.649 08:23:13 -- setup/common.sh@33 -- # echo 1025 00:05:20.649 08:23:13 -- setup/common.sh@33 -- # return 0 00:05:20.649 08:23:13 -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:05:20.649 08:23:13 -- setup/hugepages.sh@112 -- # get_nodes 00:05:20.649 08:23:13 -- setup/hugepages.sh@27 -- # local node 00:05:20.649 08:23:13 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:20.649 08:23:13 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:05:20.649 08:23:13 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:20.649 08:23:13 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:05:20.649 08:23:13 -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:20.649 08:23:13 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:20.649 08:23:13 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:20.649 08:23:13 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:20.649 08:23:13 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:20.649 08:23:13 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:20.649 08:23:13 -- setup/common.sh@18 -- # local node=0 00:05:20.649 08:23:13 -- setup/common.sh@19 -- # local var val 00:05:20.649 08:23:13 -- setup/common.sh@20 -- # local mem_f mem 00:05:20.649 08:23:13 -- setup/common.sh@22 -- # 
mem_f=/proc/meminfo 00:05:20.649 08:23:13 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:20.649 08:23:13 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:20.649 08:23:13 -- setup/common.sh@28 -- # mapfile -t mem 00:05:20.649 08:23:13 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.649 08:23:13 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 24894112 kB' 'MemUsed: 7691256 kB' 'SwapCached: 0 kB' 'Active: 4698600 kB' 'Inactive: 253652 kB' 'Active(anon): 4450440 kB' 'Inactive(anon): 0 kB' 'Active(file): 248160 kB' 'Inactive(file): 253652 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4635564 kB' 'Mapped: 52576 kB' 'AnonPages: 320160 kB' 'Shmem: 4133752 kB' 'KernelStack: 12216 kB' 'PageTables: 5772 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 122792 kB' 'Slab: 585976 kB' 'SReclaimable: 122792 kB' 'SUnreclaim: 463184 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.649 08:23:13 -- 
setup/common.sh@31 -- # read -r var val _ 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # 
continue 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.649 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.649 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.650 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.650 08:23:13 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.650 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.650 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.650 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.650 08:23:13 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.650 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.650 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.650 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.650 08:23:13 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.650 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.650 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.650 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.650 08:23:13 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.650 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.650 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.650 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.650 08:23:13 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.650 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.650 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.650 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.650 08:23:13 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.650 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.650 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.650 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.650 08:23:13 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.650 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.650 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.650 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.650 08:23:13 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.650 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.650 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.650 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.650 08:23:13 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.650 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.650 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.650 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.650 08:23:13 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.650 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.650 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.650 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.650 08:23:13 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.650 08:23:13 -- setup/common.sh@32 -- # continue 00:05:20.650 08:23:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:20.650 08:23:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:20.650 08:23:13 -- 
setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:20.650 08:23:13 -- setup/common.sh@32 -- # continue
00:05:20.650 08:23:13 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:20.650 08:23:13 -- setup/common.sh@32 -- # continue
00:05:20.650 08:23:13 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:20.650 08:23:13 -- setup/common.sh@32 -- # continue
00:05:20.650 08:23:13 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:20.650 08:23:13 -- setup/common.sh@33 -- # echo 0
00:05:20.650 08:23:13 -- setup/common.sh@33 -- # return 0
00:05:20.650 08:23:13 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:05:20.650 08:23:13 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:05:20.650 08:23:13 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:05:20.650 08:23:13 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:05:20.650 08:23:13 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:20.650 08:23:13 -- setup/common.sh@18 -- # local node=1
00:05:20.650 08:23:13 -- setup/common.sh@19 -- # local var val
00:05:20.650 08:23:13 -- setup/common.sh@20 -- # local mem_f mem
00:05:20.650 08:23:13 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:20.650 08:23:13 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:05:20.650 08:23:13 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:05:20.650 08:23:13 -- setup/common.sh@28 -- # mapfile -t mem
00:05:20.650 08:23:13 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:20.650 08:23:13 -- setup/common.sh@31 -- # IFS=': '
00:05:20.650 08:23:13 -- setup/common.sh@31 -- # read -r var val _
00:05:20.650 08:23:13 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698412 kB' 'MemFree: 15558684 kB' 'MemUsed: 12139728 kB' 'SwapCached: 0 kB' 'Active: 5640692 kB' 'Inactive: 3514368 kB' 'Active(anon): 5430048 kB' 'Inactive(anon): 0 kB' 'Active(file): 210644 kB' 'Inactive(file): 3514368 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8966512 kB' 'Mapped: 156588 kB' 'AnonPages: 188784 kB' 'Shmem: 5241500 kB' 'KernelStack: 9976 kB' 'PageTables: 3232 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 152204 kB' 'Slab: 654456 kB' 'SReclaimable: 152204 kB' 'SUnreclaim: 502252 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0'
[xtrace condensed: setup/common.sh@31-32 read every node1 field from MemTotal through HugePages_Free and hit `continue` on each key that is not HugePages_Surp]
00:05:20.911 08:23:13 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:20.911 08:23:13 -- setup/common.sh@33 -- # echo 0
00:05:20.911 08:23:13 -- setup/common.sh@33 -- # return 0
00:05:20.911 08:23:13 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:05:20.911 08:23:13 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:05:20.911 08:23:13 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:05:20.911 08:23:13 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:05:20.911 08:23:13 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513'
00:05:20.911 node0=512 expecting 513
00:05:20.911 08:23:13 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:05:20.911 08:23:13 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:05:20.911 08:23:13 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:05:20.911 08:23:13 -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512'
00:05:20.911 node1=513 expecting 512
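The scan traced above is the whole of the meminfo lookup: pick /proc/meminfo or the node's own meminfo file, strip the "Node N " prefix, then read key/value pairs until the requested key matches. A minimal sketch of that logic, reconstructed from the traced commands rather than taken from the verbatim setup/common.sh source (the helper name is made up here):

    # Reconstruction of the lookup the xtrace records -- a sketch, not the real script.
    shopt -s extglob                         # needed for the +([0-9]) pattern below
    get_meminfo_sketch() {
        local get=$1 node=$2 var val _
        local mem_f=/proc/meminfo mem
        # A node argument switches to that node's own meminfo file.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")     # per-node lines carry a "Node N " prefix
        local IFS=': '
        while read -r var val _; do
            # Non-matching keys fall through -- the long run of `continue`s in the log.
            [[ $var == "$get" ]] || continue
            echo "$val"
            return 0
        done < <(printf '%s\n' "${mem[@]}")
        return 1
    }

Against the node1 snapshot printed above, get_meminfo_sketch HugePages_Surp 1 would print 0, exactly the value fed into (( nodes_test[node] += 0 )).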
00:05:20.911 08:23:13 -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]]
00:05:20.911
00:05:20.911 real	0m3.568s
00:05:20.911 user	0m1.403s
00:05:20.911 sys	0m2.230s
00:05:20.911 08:23:13 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:05:20.911 08:23:13 -- common/autotest_common.sh@10 -- # set +x
00:05:20.911 ************************************
00:05:20.911 END TEST odd_alloc
00:05:20.911 ************************************
00:05:20.911 08:23:13 -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc
00:05:20.911 08:23:13 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:05:20.911 08:23:13 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:05:20.911 08:23:13 -- common/autotest_common.sh@10 -- # set +x
00:05:20.911 ************************************
00:05:20.911 START TEST custom_alloc
00:05:20.911 ************************************
00:05:20.911 08:23:13 -- common/autotest_common.sh@1104 -- # custom_alloc
00:05:20.911 08:23:13 -- setup/hugepages.sh@167 -- # local IFS=,
00:05:20.911 08:23:13 -- setup/hugepages.sh@169 -- # local node
00:05:20.911 08:23:13 -- setup/hugepages.sh@170 -- # nodes_hp=()
00:05:20.911 08:23:13 -- setup/hugepages.sh@170 -- # local nodes_hp
00:05:20.911 08:23:13 -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0
00:05:20.911 08:23:13 -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576
00:05:20.911 08:23:13 -- setup/hugepages.sh@49 -- # local size=1048576
00:05:20.911 08:23:13 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:05:20.911 08:23:13 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:05:20.911 08:23:13 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:05:20.911 08:23:13 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:05:20.911 08:23:13 -- setup/hugepages.sh@62 -- # user_nodes=()
00:05:20.911 08:23:13 -- setup/hugepages.sh@62 -- # local user_nodes
00:05:20.911 08:23:13 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:05:20.911 08:23:13 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:05:20.911 08:23:13 -- setup/hugepages.sh@67 -- # nodes_test=()
00:05:20.911 08:23:13 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:05:20.911 08:23:13 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:05:20.911 08:23:13 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:05:20.911 08:23:13 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:05:20.911 08:23:13 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256
00:05:20.911 08:23:13 -- setup/hugepages.sh@83 -- # : 256
00:05:20.911 08:23:13 -- setup/hugepages.sh@84 -- # : 1
00:05:20.911 08:23:13 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:05:20.911 08:23:13 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256
00:05:20.911 08:23:13 -- setup/hugepages.sh@83 -- # : 0
00:05:20.911 08:23:13 -- setup/hugepages.sh@84 -- # : 0
00:05:20.911 08:23:13 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:05:20.911 08:23:13 -- setup/hugepages.sh@175 -- # nodes_hp[0]=512
00:05:20.911 08:23:13 -- setup/hugepages.sh@176 -- # (( 2 > 1 ))
00:05:20.911 08:23:13 -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152
00:05:20.911 08:23:13 -- setup/hugepages.sh@49 -- # local size=2097152
00:05:20.911 08:23:13 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:05:20.911 08:23:13 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:05:20.911 08:23:13 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:05:20.911 08:23:13 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:05:20.911 08:23:13 -- setup/hugepages.sh@62 -- # user_nodes=()
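The two get_test_nr_hugepages calls just traced are plain arithmetic: the requested size in kB divided by the default hugepage size, which the meminfo dumps in this log report as Hugepagesize: 2048 kB. A sketch of that computation under the same values:

    # Sketch of the size -> page-count arithmetic traced above (assumes 2048 kB pages).
    default_hugepages=2048                   # kB, from Hugepagesize in /proc/meminfo
    for size in 1048576 2097152; do          # the two sizes custom_alloc requests
        (( size >= default_hugepages )) || continue
        echo "size=${size} kB -> nr_hugepages=$(( size / default_hugepages ))"
    done
    # Prints 512 and 1024 -- the nr_hugepages=512 / nr_hugepages=1024 lines above.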
00:05:20.911 08:23:13 -- setup/hugepages.sh@62 -- # local user_nodes
00:05:20.911 08:23:13 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:05:20.911 08:23:13 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:05:20.911 08:23:13 -- setup/hugepages.sh@67 -- # nodes_test=()
00:05:20.911 08:23:13 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:05:20.911 08:23:13 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:05:20.911 08:23:13 -- setup/hugepages.sh@74 -- # (( 1 > 0 ))
00:05:20.911 08:23:13 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:05:20.911 08:23:13 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512
00:05:20.911 08:23:13 -- setup/hugepages.sh@78 -- # return 0
00:05:20.911 08:23:13 -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024
00:05:20.911 08:23:13 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}"
00:05:20.911 08:23:13 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:05:20.911 08:23:13 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] ))
00:05:20.911 08:23:13 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}"
00:05:20.911 08:23:13 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:05:20.911 08:23:13 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] ))
00:05:20.911 08:23:13 -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node
00:05:20.911 08:23:13 -- setup/hugepages.sh@62 -- # user_nodes=()
00:05:20.911 08:23:13 -- setup/hugepages.sh@62 -- # local user_nodes
00:05:20.911 08:23:13 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:05:20.911 08:23:13 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:05:20.911 08:23:13 -- setup/hugepages.sh@67 -- # nodes_test=()
00:05:20.911 08:23:13 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:05:20.911 08:23:13 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:05:20.911 08:23:13 -- setup/hugepages.sh@74 -- # (( 2 > 0 ))
00:05:20.911 08:23:13 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:05:20.911 08:23:13 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512
00:05:20.911 08:23:13 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:05:20.911 08:23:13 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024
00:05:20.911 08:23:13 -- setup/hugepages.sh@78 -- # return 0
00:05:20.911 08:23:13 -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'
00:05:20.911 08:23:13 -- setup/hugepages.sh@187 -- # setup output
00:05:20.911 08:23:13 -- setup/common.sh@9 -- # [[ output == output ]]
00:05:20.911 08:23:13 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
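Before setup.sh runs, the trace above shows nodes_hp being folded into the HUGENODE environment string that setup.sh consumes. A minimal sketch of that folding, using the values from the log (the comma join comes from the local IFS=, set at the top of custom_alloc):

    # Sketch of the HUGENODE construction traced above, with the log's values.
    nodes_hp=([0]=512 [1]=1024)
    HUGENODE=() _nr_hugepages=0
    for node in "${!nodes_hp[@]}"; do
        HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
        (( _nr_hugepages += nodes_hp[node] ))
    done
    IFS=,
    echo "HUGENODE=${HUGENODE[*]}"           # nodes_hp[0]=512,nodes_hp[1]=1024
    echo "total pages: ${_nr_hugepages}"     # 1536, the count verified below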
00:05:24.201 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:24.201 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:24.201 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:24.201 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:24.201 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:24.201 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:24.201 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:24.201 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:24.201 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:24.201 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:24.201 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:24.201 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:24.201 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:24.201 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:24.201 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:24.201 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:24.201 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:05:24.201 08:23:16 -- setup/hugepages.sh@188 -- # nr_hugepages=1536
00:05:24.201 08:23:16 -- setup/hugepages.sh@188 -- # verify_nr_hugepages
00:05:24.201 08:23:16 -- setup/hugepages.sh@89 -- # local node
00:05:24.201 08:23:16 -- setup/hugepages.sh@90 -- # local sorted_t
00:05:24.201 08:23:16 -- setup/hugepages.sh@91 -- # local sorted_s
00:05:24.202 08:23:16 -- setup/hugepages.sh@92 -- # local surp
00:05:24.202 08:23:16 -- setup/hugepages.sh@93 -- # local resv
00:05:24.202 08:23:16 -- setup/hugepages.sh@94 -- # local anon
00:05:24.202 08:23:16 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:24.202 08:23:16 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:05:24.202 08:23:16 -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:24.202 08:23:16 -- setup/common.sh@18 -- # local node=
00:05:24.202 08:23:16 -- setup/common.sh@19 -- # local var val
00:05:24.202 08:23:16 -- setup/common.sh@20 -- # local mem_f mem
00:05:24.202 08:23:16 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:24.202 08:23:16 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:24.202 08:23:16 -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:24.202 08:23:16 -- setup/common.sh@28 -- # mapfile -t mem
00:05:24.202 08:23:16 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:24.202 08:23:16 -- setup/common.sh@31 -- # IFS=': '
00:05:24.202 08:23:16 -- setup/common.sh@31 -- # read -r var val _
00:05:24.202 08:23:16 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283780 kB' 'MemFree: 39424092 kB' 'MemAvailable: 43291548 kB' 'Buffers: 2708 kB' 'Cached: 13599464 kB' 'SwapCached: 0 kB' 'Active: 10339400 kB' 'Inactive: 3768020 kB' 'Active(anon): 9880596 kB' 'Inactive(anon): 0 kB' 'Active(file): 458804 kB' 'Inactive(file): 3768020 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 508492 kB' 'Mapped: 209232 kB' 'Shmem: 9375348 kB' 'KReclaimable: 274996 kB' 'Slab: 1240632 kB' 'SReclaimable: 274996 kB' 'SUnreclaim: 965636 kB' 'KernelStack: 21808 kB' 'PageTables: 8336 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957628 kB' 'Committed_AS: 11094948 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217024 kB' 'VmallocChunk: 0 kB' 'Percpu: 85568 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3226996 kB' 'DirectMap2M: 12187648 kB' 'DirectMap1G: 54525952 kB'
[xtrace condensed: setup/common.sh@31-32 skip every field from MemTotal through HardwareCorrupted with `continue` while looking for AnonHugePages]
00:05:24.202 08:23:16 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:24.202 08:23:16 -- setup/common.sh@33 -- # echo 0
00:05:24.202 08:23:16 -- setup/common.sh@33 -- # return 0
00:05:24.202 08:23:16 -- setup/hugepages.sh@97 -- # anon=0
00:05:24.203 08:23:16 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:05:24.203 08:23:16 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:24.203 08:23:16 -- setup/common.sh@18 -- # local node=
00:05:24.203 08:23:16 -- setup/common.sh@19 -- # local var val
00:05:24.203 08:23:16 -- setup/common.sh@20 -- # local mem_f mem
00:05:24.203 08:23:16 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:24.203 08:23:16 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:24.203 08:23:16 -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:24.203 08:23:16 -- setup/common.sh@28 -- # mapfile -t mem
00:05:24.203 08:23:16 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:24.203 08:23:16 -- setup/common.sh@31 -- # IFS=': '
00:05:24.203 08:23:16 -- setup/common.sh@31 -- # read -r var val _
00:05:24.203 08:23:16 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283780 kB' 'MemFree: 39424732 kB' 'MemAvailable: 43292188 kB' 'Buffers: 2708 kB' 'Cached: 13599468 kB' 'SwapCached: 0 kB' 'Active: 10338696 kB' 'Inactive: 3768020 kB' 'Active(anon): 9879892 kB' 'Inactive(anon): 0 kB' 'Active(file): 458804 kB' 'Inactive(file): 3768020 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 507740 kB' 'Mapped: 209232 kB' 'Shmem: 9375352 kB' 'KReclaimable: 274996 kB' 'Slab: 1240648 kB' 'SReclaimable: 274996 kB' 'SUnreclaim: 965652 kB' 'KernelStack: 21792 kB' 'PageTables: 8296 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957628 kB' 'Committed_AS: 11094960 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217008 kB' 'VmallocChunk: 0 kB' 'Percpu: 85568 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3226996 kB' 'DirectMap2M: 12187648 kB' 'DirectMap1G: 54525952 kB'
[xtrace condensed: the same per-field scan walks this snapshot from MemTotal through HugePages_Rsvd with `continue` until HugePages_Surp matches]
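verify_nr_hugepages, whose trace continues below, first pins down the global correction terms (anon, surp, then resv) before checking per-node counts. A sketch of that bookkeeping under the same all-zero snapshot as the log, reusing the hypothetical get_meminfo_sketch helper from earlier:

    # Sketch of the correction terms the trace collects; names follow the xtrace.
    anon=$(get_meminfo_sketch AnonHugePages)   # 0 here (AnonHugePages: 0 kB)
    surp=$(get_meminfo_sketch HugePages_Surp)  # 0: no surplus pages above nr_hugepages
    resv=$(get_meminfo_sketch HugePages_Rsvd)  # 0: nothing reserved but not yet faulted
    echo "anon=${anon} surp=${surp} resv=${resv}"
    # With all three at 0, the per-node totals must sum to the 1536 pages requested.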
00:05:24.204 08:23:16 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:24.204 08:23:16 -- setup/common.sh@33 -- # echo 0
00:05:24.204 08:23:16 -- setup/common.sh@33 -- # return 0
00:05:24.204 08:23:16 -- setup/hugepages.sh@99 -- # surp=0
00:05:24.204 08:23:16 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:05:24.204 08:23:16 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:24.204 08:23:16 -- setup/common.sh@18 -- # local node=
00:05:24.204 08:23:16 -- setup/common.sh@19 -- # local var val
00:05:24.204 08:23:16 -- setup/common.sh@20 -- # local mem_f mem
00:05:24.204 08:23:16 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:24.204 08:23:16 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
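Since nodes_hp[0]=512 and nodes_hp[1]=1024, the kernel-wide pool reported above should be exactly 1536 pages, and the dumps confirm HugePages_Total: 1536. A quick sketch of that sanity check, runnable against the same snapshot:

    # Sketch: confirm the global pool matches the per-node plan (512 + 1024 = 1536).
    expected=1536
    total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)
    if (( total == expected )); then
        echo "HugePages_Total=${total} matches nodes_hp[0]+nodes_hp[1]"
    else
        echo "mismatch: got ${total}, expected ${expected}" >&2
    fi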
00:05:24.204 08:23:16 -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:24.204 08:23:16 -- setup/common.sh@28 -- # mapfile -t mem
00:05:24.204 08:23:16 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:24.204 08:23:16 -- setup/common.sh@31 -- # IFS=': '
00:05:24.204 08:23:16 -- setup/common.sh@31 -- # read -r var val _
00:05:24.204 08:23:16 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283780 kB' 'MemFree: 39425196 kB' 'MemAvailable: 43292652 kB' 'Buffers: 2708 kB' 'Cached: 13599480 kB' 'SwapCached: 0 kB' 'Active: 10338712 kB' 'Inactive: 3768020 kB' 'Active(anon): 9879908 kB' 'Inactive(anon): 0 kB' 'Active(file): 458804 kB' 'Inactive(file): 3768020 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 507740 kB' 'Mapped: 209232 kB' 'Shmem: 9375364 kB' 'KReclaimable: 274996 kB' 'Slab: 1240648 kB' 'SReclaimable: 274996 kB' 'SUnreclaim: 965652 kB' 'KernelStack: 21792 kB' 'PageTables: 8296 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957628 kB' 'Committed_AS: 11094976 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217008 kB' 'VmallocChunk: 0 kB' 'Percpu: 85568 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3226996 kB' 'DirectMap2M: 12187648 kB' 'DirectMap1G: 54525952 kB'
[xtrace condensed: the per-field scan runs once more against this snapshot, skipping MemTotal through FilePmdMapped with `continue` while looking for HugePages_Rsvd]
00:05:24.205 08:23:16 -- setup/common.sh@31 -- #
IFS=': ' 00:05:24.205 08:23:16 -- setup/common.sh@31 -- # read -r var val _ 00:05:24.205 08:23:16 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.205 08:23:16 -- setup/common.sh@32 -- # continue 00:05:24.205 08:23:16 -- setup/common.sh@31 -- # IFS=': ' 00:05:24.205 08:23:16 -- setup/common.sh@31 -- # read -r var val _ 00:05:24.205 08:23:16 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.205 08:23:16 -- setup/common.sh@32 -- # continue 00:05:24.205 08:23:16 -- setup/common.sh@31 -- # IFS=': ' 00:05:24.205 08:23:16 -- setup/common.sh@31 -- # read -r var val _ 00:05:24.205 08:23:16 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.205 08:23:16 -- setup/common.sh@32 -- # continue 00:05:24.205 08:23:16 -- setup/common.sh@31 -- # IFS=': ' 00:05:24.205 08:23:16 -- setup/common.sh@31 -- # read -r var val _ 00:05:24.205 08:23:16 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.205 08:23:16 -- setup/common.sh@32 -- # continue 00:05:24.205 08:23:16 -- setup/common.sh@31 -- # IFS=': ' 00:05:24.205 08:23:16 -- setup/common.sh@31 -- # read -r var val _ 00:05:24.205 08:23:16 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.205 08:23:16 -- setup/common.sh@32 -- # continue 00:05:24.205 08:23:16 -- setup/common.sh@31 -- # IFS=': ' 00:05:24.205 08:23:16 -- setup/common.sh@31 -- # read -r var val _ 00:05:24.205 08:23:16 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.205 08:23:16 -- setup/common.sh@33 -- # echo 0 00:05:24.205 08:23:16 -- setup/common.sh@33 -- # return 0 00:05:24.205 08:23:16 -- setup/hugepages.sh@100 -- # resv=0 00:05:24.205 08:23:16 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:05:24.205 nr_hugepages=1536 00:05:24.205 08:23:16 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:24.205 resv_hugepages=0 00:05:24.205 08:23:16 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:24.205 surplus_hugepages=0 00:05:24.205 08:23:16 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:24.205 anon_hugepages=0 00:05:24.205 08:23:16 -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:05:24.205 08:23:16 -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:05:24.205 08:23:16 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:24.205 08:23:16 -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:24.205 08:23:16 -- setup/common.sh@18 -- # local node= 00:05:24.205 08:23:16 -- setup/common.sh@19 -- # local var val 00:05:24.205 08:23:16 -- setup/common.sh@20 -- # local mem_f mem 00:05:24.205 08:23:16 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:24.205 08:23:16 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:24.205 08:23:16 -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:24.205 08:23:16 -- setup/common.sh@28 -- # mapfile -t mem 00:05:24.205 08:23:16 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:24.205 08:23:16 -- setup/common.sh@31 -- # IFS=': ' 00:05:24.205 08:23:16 -- setup/common.sh@31 -- # read -r var val _ 00:05:24.205 08:23:16 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283780 kB' 'MemFree: 39425344 kB' 'MemAvailable: 43292800 kB' 'Buffers: 2708 kB' 'Cached: 13599504 kB' 'SwapCached: 0 kB' 'Active: 10338372 kB' 'Inactive: 3768020 kB' 'Active(anon): 9879568 kB' 'Inactive(anon): 0 kB' 'Active(file): 458804 kB' 'Inactive(file): 3768020 kB' 
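The run above is the visible half of the common.sh meminfo scan: mapfile slurps the chosen meminfo file, any "Node N " prefix is stripped, and an IFS=': ' read loop walks key/value pairs until the requested field matches, echoing the value back to the caller (here HugePages_Rsvd yields resv=0, which the @107 record then folds into the nr_hugepages + surp + resv sanity check). A minimal self-contained sketch of that pattern, with a function name of our own choosing (the real helper in setup/common.sh also takes an optional node argument):

  #!/usr/bin/env bash
  # Sketch of the read/continue scan driving the xtrace above; name and shape are ours.
  get_meminfo_field() {
      local want=$1 var val _
      while IFS=': ' read -r var val _; do    # "HugePages_Rsvd:   0" -> var=key, val=value
          [[ $var == "$want" ]] || continue   # produces the long runs of 'continue' records
          echo "$val"
          return 0
      done </proc/meminfo
      return 1
  }
  resv=$(get_meminfo_field HugePages_Rsvd)    # 0 in this run
  echo "resv_hugepages=$resv"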
00:05:24.205 08:23:16 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:05:24.205 08:23:16 -- setup/common.sh@17 -- # local get=HugePages_Total
00:05:24.205 08:23:16 -- setup/common.sh@18 -- # local node=
00:05:24.205 08:23:16 -- setup/common.sh@19 -- # local var val
00:05:24.205 08:23:16 -- setup/common.sh@20 -- # local mem_f mem
00:05:24.205 08:23:16 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:24.205 08:23:16 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:24.205 08:23:16 -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:24.205 08:23:16 -- setup/common.sh@28 -- # mapfile -t mem
00:05:24.205 08:23:16 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:24.205 08:23:16 -- setup/common.sh@31 -- # IFS=': '
00:05:24.205 08:23:16 -- setup/common.sh@31 -- # read -r var val _
00:05:24.205 08:23:16 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283780 kB' 'MemFree: 39425344 kB' 'MemAvailable: 43292800 kB' 'Buffers: 2708 kB' 'Cached: 13599504 kB' 'SwapCached: 0 kB' 'Active: 10338372 kB' 'Inactive: 3768020 kB' 'Active(anon): 9879568 kB' 'Inactive(anon): 0 kB' 'Active(file): 458804 kB' 'Inactive(file): 3768020 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 507352 kB' 'Mapped: 209232 kB' 'Shmem: 9375388 kB' 'KReclaimable: 274996 kB' 'Slab: 1240652 kB' 'SReclaimable: 274996 kB' 'SUnreclaim: 965656 kB' 'KernelStack: 21776 kB' 'PageTables: 8244 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957628 kB' 'Committed_AS: 11094992 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217008 kB' 'VmallocChunk: 0 kB' 'Percpu: 85568 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3226996 kB' 'DirectMap2M: 12187648 kB' 'DirectMap1G: 54525952 kB'
00:05:24.206 08:23:16 -- setup/common.sh@31-32 -- # [scan: each snapshot field above, MemTotal through Unaccepted, compared against HugePages_Total, continue on every miss]
00:05:24.206 08:23:16 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:05:24.206 08:23:16 -- setup/common.sh@33 -- # echo 1536
00:05:24.206 08:23:16 -- setup/common.sh@33 -- # return 0
00:05:24.206 08:23:16 -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv ))
00:05:24.206 08:23:16 -- setup/hugepages.sh@112 -- # get_nodes
00:05:24.206 08:23:16 -- setup/hugepages.sh@27 -- # local node
00:05:24.206 08:23:16 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:24.206 08:23:16 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:05:24.206 08:23:16 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:24.206 08:23:16 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:05:24.206 08:23:16 -- setup/hugepages.sh@32 -- # no_nodes=2
00:05:24.206 08:23:16 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:05:24.206 08:23:16 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:05:24.206 08:23:16 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:05:24.206 08:23:16 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:05:24.206 08:23:16 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:24.206 08:23:16 -- setup/common.sh@18 -- # local node=0
00:05:24.206 08:23:16 -- setup/common.sh@19 -- # local var val
00:05:24.206 08:23:16 -- setup/common.sh@20 -- # local mem_f mem
00:05:24.206 08:23:16 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:24.206 08:23:16 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:05:24.206 08:23:16 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:05:24.206 08:23:16 -- setup/common.sh@28 -- # mapfile -t mem
00:05:24.206 08:23:16 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:24.206 08:23:16 -- setup/common.sh@31 -- # IFS=': '
00:05:24.206 08:23:16 -- setup/common.sh@31 -- # read -r var val _
00:05:24.206 08:23:16 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 24898700 kB' 'MemUsed: 7686668 kB' 'SwapCached: 0 kB' 'Active: 4698256 kB' 'Inactive: 253652 kB' 'Active(anon): 4450096 kB' 'Inactive(anon): 0 kB' 'Active(file): 248160 kB' 'Inactive(file): 253652 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4635576 kB' 'Mapped: 52656 kB' 'AnonPages: 319484 kB' 'Shmem: 4133764 kB' 'KernelStack: 11864 kB' 'PageTables: 5260 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 122792 kB' 'Slab: 586192 kB' 'SReclaimable: 122792 kB' 'SUnreclaim: 463400 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:05:24.206 08:23:16 -- setup/common.sh@31-32 -- # [scan: each node0 field above, MemTotal through HugePages_Free, compared against HugePages_Surp, continue on every miss]
00:05:24.206 08:23:16 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:24.206 08:23:16 -- setup/common.sh@33 -- # echo 0
00:05:24.206 08:23:16 -- setup/common.sh@33 -- # return 0
00:05:24.206 08:23:16 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
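For node 0 the same scan ran against /sys/devices/system/node/node0/meminfo instead of /proc/meminfo: the @23/@24 records show the file swap, and the @29 substitution strips the "Node 0 " prefix that per-node meminfo lines carry. A hedged sketch of that source selection (the sysfs paths are the standard kernel ones; the helper name is ours):

  #!/usr/bin/env bash
  shopt -s extglob                            # needed for the +([0-9]) pattern below
  # Sketch: pick the meminfo source for an optional NUMA node (name is ours).
  meminfo_source() {
      local node=$1 mem_f=/proc/meminfo
      if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
          mem_f=/sys/devices/system/node/node$node/meminfo
      fi
      echo "$mem_f"
  }
  mapfile -t mem < "$(meminfo_source 0)"
  mem=("${mem[@]#Node +([0-9]) }")            # drop the "Node 0 " prefix, as in common.sh@29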
00:05:24.206 08:23:16 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:05:24.206 08:23:16 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:05:24.206 08:23:16 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:05:24.206 08:23:16 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:24.206 08:23:16 -- setup/common.sh@18 -- # local node=1
00:05:24.206 08:23:16 -- setup/common.sh@19 -- # local var val
00:05:24.206 08:23:16 -- setup/common.sh@20 -- # local mem_f mem
00:05:24.206 08:23:16 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:24.206 08:23:16 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:05:24.206 08:23:16 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:05:24.206 08:23:16 -- setup/common.sh@28 -- # mapfile -t mem
00:05:24.206 08:23:16 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:24.206 08:23:16 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698412 kB' 'MemFree: 14526644 kB' 'MemUsed: 13171768 kB' 'SwapCached: 0 kB' 'Active: 5640156 kB' 'Inactive: 3514368 kB' 'Active(anon): 5429512 kB' 'Inactive(anon): 0 kB' 'Active(file): 210644 kB' 'Inactive(file): 3514368 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8966664 kB' 'Mapped: 156576 kB' 'AnonPages: 187880 kB' 'Shmem: 5241652 kB' 'KernelStack: 9912 kB' 'PageTables: 2984 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 152204 kB' 'Slab: 654460 kB' 'SReclaimable: 152204 kB' 'SUnreclaim: 502256 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:05:24.206 08:23:16 -- setup/common.sh@31 -- # IFS=': '
00:05:24.206 08:23:16 -- setup/common.sh@31 -- # read -r var val _
00:05:24.206 08:23:16 -- setup/common.sh@31-32 -- # [scan: each node1 field above, MemTotal through HugePages_Free, compared against HugePages_Surp, continue on every miss]
00:05:24.207 08:23:16 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:24.207 08:23:16 -- setup/common.sh@33 -- # echo 0
00:05:24.207 08:23:16 -- setup/common.sh@33 -- # return 0
00:05:24.207 08:23:16 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:05:24.207 08:23:16 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:05:24.207 08:23:16 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:05:24.207 08:23:16 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:05:24.207 08:23:16 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:05:24.207 node0=512 expecting 512
00:05:24.207 08:23:16 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:05:24.207 08:23:16 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:05:24.207 08:23:16 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:05:24.207 08:23:16 -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024'
00:05:24.207 node1=1024 expecting 1024
00:05:24.207 08:23:16 -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]]
00:05:24.207 
00:05:24.207 real 0m3.459s
00:05:24.207 user 0m1.221s
00:05:24.207 sys 0m2.190s
00:05:24.207 08:23:16 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:05:24.207 08:23:16 -- common/autotest_common.sh@10 -- # set +x
00:05:24.207 ************************************
00:05:24.207 END TEST custom_alloc
00:05:24.207 ************************************
00:05:24.207 08:23:16 -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc
00:05:24.207 08:23:16 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:05:24.207 08:23:16 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:05:24.207 08:23:16 -- common/autotest_common.sh@10 -- # set +x
00:05:24.466 ************************************
00:05:24.466 START TEST no_shrink_alloc
00:05:24.466 ************************************
00:05:24.466 08:23:16 -- common/autotest_common.sh@1104 -- # no_shrink_alloc
00:05:24.466 08:23:16 -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0
00:05:24.466 08:23:16 -- setup/hugepages.sh@49 -- # local size=2097152
00:05:24.466 08:23:16 -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:05:24.466 08:23:16 -- setup/hugepages.sh@51 -- # shift
00:05:24.466 08:23:16 -- setup/hugepages.sh@52 -- # node_ids=('0')
00:05:24.466 08:23:16 -- setup/hugepages.sh@52 -- # local node_ids
00:05:24.466 08:23:16 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:05:24.466 08:23:16 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:05:24.466 08:23:16 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0
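get_test_nr_hugepages turns a size budget into a page count and pins it to the node list left after the shift: with the 2048 kB Hugepagesize reported in the snapshots, 2097152 works out to the nr_hugepages=1024 recorded above, which suggests the size argument is in kB. A back-of-envelope sketch under that assumption (variable names are ours, not the script's):

  #!/usr/bin/env bash
  # Sketch of the size -> page-count arithmetic behind nr_hugepages=1024.
  size_kb=2097152
  hugepagesize_kb=$(awk '/^Hugepagesize:/ {print $2}' /proc/meminfo)   # 2048 on this box
  (( nr_hugepages = size_kb / hugepagesize_kb ))                       # 2097152 / 2048 = 1024
  declare -a nodes_test
  for node_id in 0; do                        # node ids left after the shift: just '0'
      nodes_test[node_id]=$nr_hugepages       # whole allocation targeted at node 0
  done
  echo "nr_hugepages=$nr_hugepages (node 0)"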
00:05:24.466 08:23:16 -- setup/hugepages.sh@62 -- # user_nodes=('0')
00:05:24.466 08:23:16 -- setup/hugepages.sh@62 -- # local user_nodes
00:05:24.466 08:23:16 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:05:24.466 08:23:16 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:05:24.466 08:23:16 -- setup/hugepages.sh@67 -- # nodes_test=()
00:05:24.466 08:23:16 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:05:24.466 08:23:16 -- setup/hugepages.sh@69 -- # (( 1 > 0 ))
00:05:24.466 08:23:16 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:05:24.466 08:23:16 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024
00:05:24.466 08:23:16 -- setup/hugepages.sh@73 -- # return 0
00:05:24.467 08:23:16 -- setup/hugepages.sh@198 -- # setup output
00:05:24.467 08:23:16 -- setup/common.sh@9 -- # [[ output == output ]]
00:05:24.467 08:23:16 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:27.762 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:27.762 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:27.762 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:27.762 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:27.762 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:27.762 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:27.762 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:27.762 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:27.762 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:27.762 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:27.762 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:27.762 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:27.762 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:27.762 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:27.762 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:27.762 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:27.762 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:05:27.762 08:23:20 -- setup/hugepages.sh@199 -- # verify_nr_hugepages
00:05:27.762 08:23:20 -- setup/hugepages.sh@89 -- # local node
00:05:27.762 08:23:20 -- setup/hugepages.sh@90 -- # local sorted_t
00:05:27.762 08:23:20 -- setup/hugepages.sh@91 -- # local sorted_s
00:05:27.762 08:23:20 -- setup/hugepages.sh@92 -- # local surp
00:05:27.762 08:23:20 -- setup/hugepages.sh@93 -- # local resv
00:05:27.762 08:23:20 -- setup/hugepages.sh@94 -- # local anon
00:05:27.762 08:23:20 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
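The @96 test is the transparent-hugepage gate: /sys/kernel/mm/transparent_hugepage/enabled reports something like "always [madvise] never" with the active mode in brackets, and only when that mode is not [never] does verify_nr_hugepages go on to sample AnonHugePages, as the records that follow show. A hedged sketch of the same check:

  #!/usr/bin/env bash
  # Sketch: sample AnonHugePages only when THP is not disabled outright.
  thp=$(</sys/kernel/mm/transparent_hugepage/enabled)   # e.g. "always [madvise] never"
  anon=0
  if [[ $thp != *"[never]"* ]]; then
      anon=$(awk '/^AnonHugePages:/ {print $2}' /proc/meminfo)   # kB of THP-backed anon memory
  fi
  echo "anon_hugepages=$anon"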
00:05:27.762 08:23:20 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:05:27.762 08:23:20 -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:27.762 08:23:20 -- setup/common.sh@18 -- # local node=
00:05:27.762 08:23:20 -- setup/common.sh@19 -- # local var val
00:05:27.762 08:23:20 -- setup/common.sh@20 -- # local mem_f mem
00:05:27.762 08:23:20 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:27.762 08:23:20 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:27.762 08:23:20 -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:27.762 08:23:20 -- setup/common.sh@28 -- # mapfile -t mem
00:05:27.762 08:23:20 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:27.762 08:23:20 -- setup/common.sh@31 -- # IFS=': '
00:05:27.762 08:23:20 -- setup/common.sh@31 -- # read -r var val _
00:05:27.763 08:23:20 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283780 kB' 'MemFree: 40460692 kB' 'MemAvailable: 44328148 kB' 'Buffers: 2708 kB' 'Cached: 13599592 kB' 'SwapCached: 0 kB' 'Active: 10342788 kB' 'Inactive: 3768020 kB' 'Active(anon): 9883984 kB' 'Inactive(anon): 0 kB' 'Active(file): 458804 kB' 'Inactive(file): 3768020 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512180 kB' 'Mapped: 209240 kB' 'Shmem: 9375476 kB' 'KReclaimable: 274996 kB' 'Slab: 1241296 kB' 'SReclaimable: 274996 kB' 'SUnreclaim: 966300 kB' 'KernelStack: 21808 kB' 'PageTables: 8348 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 11095604 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216944 kB' 'VmallocChunk: 0 kB' 'Percpu: 85568 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3226996 kB' 'DirectMap2M: 12187648 kB' 'DirectMap1G: 54525952 kB'
00:05:27.763 08:23:20 -- setup/common.sh@31-32 -- # [scan: each snapshot field above, MemTotal through HardwareCorrupted, compared against AnonHugePages, continue on every miss]
00:05:27.763 08:23:20 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:27.763 08:23:20 -- setup/common.sh@33 -- # echo 0
00:05:27.763 08:23:20 -- setup/common.sh@33 -- # return 0
00:05:27.763 08:23:20 -- setup/hugepages.sh@97 -- # anon=0
00:05:27.763 08:23:20 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:05:27.763 08:23:20 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:27.763 08:23:20 -- setup/common.sh@18 -- # local node=
00:05:27.763 08:23:20 -- setup/common.sh@19 -- # local var val
00:05:27.763 08:23:20 -- setup/common.sh@20 -- # local mem_f mem
00:05:27.763 08:23:20 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:27.763 08:23:20 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:27.763 08:23:20 -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:27.763 08:23:20 -- setup/common.sh@28 -- # mapfile -t mem
00:05:27.764 08:23:20 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:27.764 08:23:20 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283780 kB' 'MemFree: 40462264 kB' 'MemAvailable: 44329720 kB' 'Buffers: 2708 kB' 'Cached: 13599596 kB' 'SwapCached: 0 kB' 'Active: 10342484 kB' 'Inactive: 3768020 kB' 'Active(anon): 9883680 kB' 'Inactive(anon): 0 kB' 'Active(file): 458804 kB' 'Inactive(file): 3768020 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 511804 kB' 'Mapped: 209240 kB' 'Shmem: 9375480 kB' 'KReclaimable: 
274996 kB' 'Slab: 1241304 kB' 'SReclaimable: 274996 kB' 'SUnreclaim: 966308 kB' 'KernelStack: 21792 kB' 'PageTables: 8284 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 11095616 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216928 kB' 'VmallocChunk: 0 kB' 'Percpu: 85568 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3226996 kB' 'DirectMap2M: 12187648 kB' 'DirectMap1G: 54525952 kB' 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # read -r var 
val _ 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.764 08:23:20 -- 
setup/common.sh@31 -- # IFS=': ' 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.764 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.764 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # [[ VmallocTotal == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.765 
08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:27.765 08:23:20 -- setup/common.sh@33 -- # echo 0 00:05:27.765 08:23:20 -- setup/common.sh@33 -- # return 0 00:05:27.765 08:23:20 -- setup/hugepages.sh@99 -- # surp=0 00:05:27.765 08:23:20 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:27.765 08:23:20 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:27.765 08:23:20 -- setup/common.sh@18 -- # local node= 00:05:27.765 08:23:20 -- setup/common.sh@19 -- # local var val 00:05:27.765 08:23:20 -- setup/common.sh@20 -- # local mem_f mem 00:05:27.765 08:23:20 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:27.765 08:23:20 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:27.765 08:23:20 -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:27.765 08:23:20 -- setup/common.sh@28 -- # mapfile -t mem 00:05:27.765 08:23:20 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.765 08:23:20 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283780 kB' 'MemFree: 40463076 kB' 'MemAvailable: 44330532 kB' 'Buffers: 2708 kB' 'Cached: 13599608 kB' 'SwapCached: 0 kB' 'Active: 10342380 kB' 'Inactive: 3768020 kB' 'Active(anon): 9883576 kB' 'Inactive(anon): 0 kB' 'Active(file): 458804 kB' 'Inactive(file): 3768020 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 511672 kB' 'Mapped: 209240 kB' 'Shmem: 9375492 kB' 'KReclaimable: 274996 kB' 'Slab: 1241364 kB' 'SReclaimable: 274996 kB' 'SUnreclaim: 966368 kB' 'KernelStack: 21792 kB' 'PageTables: 8300 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 11095632 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216928 kB' 'VmallocChunk: 0 kB' 'Percpu: 85568 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3226996 kB' 'DirectMap2M: 12187648 kB' 'DirectMap1G: 54525952 kB' 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.765 08:23:20 -- 
setup/common.sh@31 -- # IFS=': ' 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.765 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.765 08:23:20 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # [[ Mlocked == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # read -r var 
val _ 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.766 
08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.766 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.766 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:27.767 08:23:20 -- setup/common.sh@33 -- # echo 0 00:05:27.767 08:23:20 -- setup/common.sh@33 -- # return 0 00:05:27.767 08:23:20 -- setup/hugepages.sh@100 -- # resv=0 00:05:27.767 08:23:20 -- setup/hugepages.sh@102 
-- # echo nr_hugepages=1024 00:05:27.767 nr_hugepages=1024 00:05:27.767 08:23:20 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:27.767 resv_hugepages=0 00:05:27.767 08:23:20 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:27.767 surplus_hugepages=0 00:05:27.767 08:23:20 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:27.767 anon_hugepages=0 00:05:27.767 08:23:20 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:27.767 08:23:20 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:05:27.767 08:23:20 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:27.767 08:23:20 -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:27.767 08:23:20 -- setup/common.sh@18 -- # local node= 00:05:27.767 08:23:20 -- setup/common.sh@19 -- # local var val 00:05:27.767 08:23:20 -- setup/common.sh@20 -- # local mem_f mem 00:05:27.767 08:23:20 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:27.767 08:23:20 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:27.767 08:23:20 -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:27.767 08:23:20 -- setup/common.sh@28 -- # mapfile -t mem 00:05:27.767 08:23:20 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.767 08:23:20 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283780 kB' 'MemFree: 40463184 kB' 'MemAvailable: 44330640 kB' 'Buffers: 2708 kB' 'Cached: 13599620 kB' 'SwapCached: 0 kB' 'Active: 10342608 kB' 'Inactive: 3768020 kB' 'Active(anon): 9883804 kB' 'Inactive(anon): 0 kB' 'Active(file): 458804 kB' 'Inactive(file): 3768020 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 511908 kB' 'Mapped: 209240 kB' 'Shmem: 9375504 kB' 'KReclaimable: 274996 kB' 'Slab: 1241364 kB' 'SReclaimable: 274996 kB' 'SUnreclaim: 966368 kB' 'KernelStack: 21792 kB' 'PageTables: 8304 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 11095648 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216928 kB' 'VmallocChunk: 0 kB' 'Percpu: 85568 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3226996 kB' 'DirectMap2M: 12187648 kB' 'DirectMap1G: 54525952 kB' 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.767 08:23:20 -- setup/common.sh@31 
-- # read -r var val _ 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.767 08:23:20 -- 
setup/common.sh@32 -- # continue 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.767 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.767 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.768 08:23:20 -- 
setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # continue 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # IFS=': ' 00:05:27.768 08:23:20 -- setup/common.sh@31 -- # read -r var val _ 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:27.768 08:23:20 -- setup/common.sh@32 -- # continue 
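The four get_meminfo passes replayed above and below (AnonHugePages, HugePages_Surp, HugePages_Rsvd, HugePages_Total) all drive the same loop in setup/common.sh: snapshot the relevant meminfo file into an array, then read it back field by field under IFS=': ' until the requested key matches, echoing that key's value — 0 for the first three passes, 1024 for HugePages_Total — which setup/hugepages.sh then feeds into the consistency check (( 1024 == nr_hugepages + surp + resv )). A condensed sketch of that loop, simplified from the trace (get_meminfo_sketch and its structure are illustrative, not the script's verbatim code):

    shopt -s extglob   # needed for the +([0-9]) pattern below
    # Snapshot a meminfo file, strip any "Node N " prefix, then scan
    # key by key until the requested field matches and echo its value.
    get_meminfo_sketch() {
        local get=$1 node=${2:-} var val _ line
        local mem_f=/proc/meminfo
        [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        local -a mem
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")   # per-node files prefix each line
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] && { echo "${val:-0}"; return 0; }
        done
        echo 0
    }
    get_meminfo_sketch HugePages_Total      # -> 1024 on this box
    get_meminfo_sketch HugePages_Surp 0     # node0 file, -> 0

Scanning every key and matching with [[ ... ]] instead of grepping keeps the whole lookup in bash builtins, which is why the trace shows one continue per meminfo field rather than an external process per call.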
00:05:27.768 08:23:20 -- setup/common.sh@31 -- # IFS=': '
00:05:27.768 08:23:20 -- setup/common.sh@31 -- # read -r var val _
[... per-field scan continues: ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, CmaTotal, CmaFree and Unaccepted are each skipped with "continue" ...]
00:05:27.768 08:23:20 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:05:27.768 08:23:20 -- setup/common.sh@33 -- # echo 1024
00:05:27.768 08:23:20 -- setup/common.sh@33 -- # return 0
00:05:27.768 08:23:20 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:27.768 08:23:20 -- setup/hugepages.sh@112 -- # get_nodes
00:05:27.768 08:23:20 -- setup/hugepages.sh@27 -- # local node
00:05:27.768 08:23:20 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:27.768 08:23:20 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:05:27.768 08:23:20 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:27.768 08:23:20 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:05:27.768 08:23:20 -- setup/hugepages.sh@32 -- # no_nodes=2
00:05:27.768 08:23:20 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:05:27.768 08:23:20 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:05:27.768 08:23:20 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:05:27.768 08:23:20 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:05:27.768 08:23:20 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:27.768 08:23:20 -- setup/common.sh@18 -- # local node=0
00:05:27.768 08:23:20 -- setup/common.sh@19 -- # local var val
00:05:27.768 08:23:20 -- setup/common.sh@20 -- # local mem_f mem
00:05:27.768 08:23:20 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:27.768 08:23:20 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:05:27.768 08:23:20 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:05:27.768 08:23:20 -- setup/common.sh@28 -- # mapfile -t mem
00:05:27.768 08:23:20 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:27.768 08:23:20 -- setup/common.sh@31 -- # IFS=': '
00:05:27.768 08:23:20 -- setup/common.sh@31 -- # read -r var val _
00:05:27.768 08:23:20 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 23836092 kB' 'MemUsed: 8749276 kB' 'SwapCached: 0 kB' 'Active: 4699804 kB' 'Inactive: 253652 kB' 'Active(anon): 4451644 kB' 'Inactive(anon): 0 kB' 'Active(file): 248160 kB' 'Inactive(file): 253652 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4635624 kB' 'Mapped: 52664 kB' 'AnonPages: 321196 kB' 'Shmem: 4133812 kB' 'KernelStack: 11880 kB' 'PageTables: 5316 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 122792 kB' 'Slab: 586388 kB' 'SReclaimable: 122792 kB' 'SUnreclaim: 463596 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[... per-field scan of the node0 snapshot: MemTotal through HugePages_Free are each skipped with "continue" ...]
00:05:27.769 08:23:20 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:27.769 08:23:20 -- setup/common.sh@33 -- # echo 0
00:05:27.769 08:23:20 -- setup/common.sh@33 -- # return 0
00:05:27.769 08:23:20 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:05:27.769 08:23:20 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:05:27.769 08:23:20 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:05:27.769 08:23:20 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:05:27.769 08:23:20 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:05:27.769 node0=1024 expecting 1024
00:05:27.769 08:23:20 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:05:27.769 08:23:20 -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no
00:05:27.769 08:23:20 -- setup/hugepages.sh@202 -- # NRHUGE=512
00:05:27.769 08:23:20 -- setup/hugepages.sh@202 -- # setup output
00:05:27.769 08:23:20 -- setup/common.sh@9 -- # [[ output == output ]]
00:05:27.769 08:23:20 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:31.962 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:31.962 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:31.962 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:31.962 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:31.962 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:31.962 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:31.962 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:31.962 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:31.962 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:31.962 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:31.962 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:31.962 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:31.962 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:31.962 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:31.962 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:31.962 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:31.962 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:05:31.962 INFO: Requested 512 hugepages but 1024 already allocated on node0
00:05:31.962 08:23:23 -- setup/hugepages.sh@204 -- # verify_nr_hugepages
00:05:31.962 08:23:23 -- setup/hugepages.sh@89 -- # local node
00:05:31.962 08:23:23 -- setup/hugepages.sh@90 -- # local sorted_t
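The scans traced above all follow one pattern from setup/common.sh's get_meminfo: pick a meminfo file (the node-local one for per-node queries), strip any "Node N " prefix, then walk the fields until the requested name matches and print its value. A minimal stand-alone sketch of that pattern, reconstructed from the trace rather than copied from the SPDK source (the function name is hypothetical):

    #!/usr/bin/env bash
    # Reconstructed sketch of the get_meminfo pattern exercised in the trace;
    # behavior is inferred from the xtrace lines, not the verbatim script.
    shopt -s extglob

    get_meminfo_sketch() {
        local get=$1 node=${2:-}
        local mem_f=/proc/meminfo
        local -a mem
        local entry var val _

        # Per-node queries read the node-local file; its lines carry a
        # "Node N " prefix that /proc/meminfo lines do not have.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi

        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }") # strip the per-node prefix, if any

        for entry in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$entry"
            [[ $var == "$get" ]] || continue # skip fields until the name matches
            echo "$val"                      # numeric value only, e.g. 1024 or 0
            return 0
        done
        return 1
    }

    # Usage, mirroring the calls in the trace:
    #   get_meminfo_sketch HugePages_Total     # system-wide: 1024
    #   get_meminfo_sketch HugePages_Surp 0    # node0 surplus: 0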
00:05:31.962 08:23:23 -- setup/hugepages.sh@91 -- # local sorted_s
00:05:31.962 08:23:23 -- setup/hugepages.sh@92 -- # local surp
00:05:31.962 08:23:23 -- setup/hugepages.sh@93 -- # local resv
00:05:31.962 08:23:23 -- setup/hugepages.sh@94 -- # local anon
00:05:31.962 08:23:23 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:31.962 08:23:23 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:05:31.962 08:23:23 -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:31.962 08:23:23 -- setup/common.sh@18 -- # local node=
00:05:31.962 08:23:23 -- setup/common.sh@19 -- # local var val
00:05:31.962 08:23:23 -- setup/common.sh@20 -- # local mem_f mem
00:05:31.962 08:23:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:31.962 08:23:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:31.962 08:23:23 -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:31.962 08:23:23 -- setup/common.sh@28 -- # mapfile -t mem
00:05:31.962 08:23:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:31.962 08:23:23 -- setup/common.sh@31 -- # IFS=': '
00:05:31.962 08:23:23 -- setup/common.sh@31 -- # read -r var val _
00:05:31.962 08:23:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283780 kB' 'MemFree: 40463416 kB' 'MemAvailable: 44330868 kB' 'Buffers: 2708 kB' 'Cached: 13599712 kB' 'SwapCached: 0 kB' 'Active: 10339840 kB' 'Inactive: 3768020 kB' 'Active(anon): 9881036 kB' 'Inactive(anon): 0 kB' 'Active(file): 458804 kB' 'Inactive(file): 3768020 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 508680 kB' 'Mapped: 209256 kB' 'Shmem: 9375596 kB' 'KReclaimable: 274988 kB' 'Slab: 1241432 kB' 'SReclaimable: 274988 kB' 'SUnreclaim: 966444 kB' 'KernelStack: 21872 kB' 'PageTables: 8284 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 11100600 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217056 kB' 'VmallocChunk: 0 kB' 'Percpu: 85568 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3226996 kB' 'DirectMap2M: 12187648 kB' 'DirectMap1G: 54525952 kB'
[... per-field scan of the snapshot: MemTotal through HardwareCorrupted are each skipped with "continue" ...]
00:05:31.963 08:23:23 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:31.963 08:23:23 -- setup/common.sh@33 -- # echo 0
00:05:31.963 08:23:23 -- setup/common.sh@33 -- # return 0
00:05:31.963 08:23:23 -- setup/hugepages.sh@97 -- # anon=0
00:05:31.963 08:23:23 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:05:31.963 08:23:23 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:31.963 08:23:23 -- setup/common.sh@18 -- # local node=
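The [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] test at hugepages.sh@96 above gates the AnonHugePages lookup: anonymous hugepages are counted only when transparent hugepages are not fully disabled. A sketch of that gate, reusing the hypothetical get_meminfo_sketch helper from above (the sysfs path is the standard kernel location):

    # THP gate seen at hugepages.sh@96: AnonHugePages only counts toward the
    # total when the THP mode string does not select "[never]".
    anon=0
    thp_mode=$(< /sys/kernel/mm/transparent_hugepage/enabled) # e.g. "always [madvise] never"
    if [[ $thp_mode != *"[never]"* ]]; then
        anon=$(get_meminfo_sketch AnonHugePages) # in this log: 0
    fi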
00:05:31.963 08:23:23 -- setup/common.sh@19 -- # local var val
00:05:31.963 08:23:23 -- setup/common.sh@20 -- # local mem_f mem
00:05:31.963 08:23:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:31.963 08:23:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:31.963 08:23:23 -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:31.963 08:23:23 -- setup/common.sh@28 -- # mapfile -t mem
00:05:31.963 08:23:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:31.964 08:23:23 -- setup/common.sh@31 -- # IFS=': '
00:05:31.964 08:23:23 -- setup/common.sh@31 -- # read -r var val _
00:05:31.964 08:23:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283780 kB' 'MemFree: 40464300 kB' 'MemAvailable: 44331752 kB' 'Buffers: 2708 kB' 'Cached: 13599716 kB' 'SwapCached: 0 kB' 'Active: 10340104 kB' 'Inactive: 3768020 kB' 'Active(anon): 9881300 kB' 'Inactive(anon): 0 kB' 'Active(file): 458804 kB' 'Inactive(file): 3768020 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 509416 kB' 'Mapped: 209256 kB' 'Shmem: 9375600 kB' 'KReclaimable: 274988 kB' 'Slab: 1241432 kB' 'SReclaimable: 274988 kB' 'SUnreclaim: 966444 kB' 'KernelStack: 21760 kB' 'PageTables: 8684 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 11113000 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217024 kB' 'VmallocChunk: 0 kB' 'Percpu: 85568 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3226996 kB' 'DirectMap2M: 12187648 kB' 'DirectMap1G: 54525952 kB'
[... per-field scan of the snapshot: MemTotal through HugePages_Rsvd are each skipped with "continue" ...]
00:05:31.965 08:23:24 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:31.965 08:23:24 -- setup/common.sh@33 -- # echo 0
00:05:31.965 08:23:24 -- setup/common.sh@33 -- # return 0
00:05:31.965 08:23:24 -- setup/hugepages.sh@99 -- # surp=0
00:05:31.965 08:23:24 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:05:31.965 08:23:24 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:31.965 08:23:24 -- setup/common.sh@18 -- # local node=
00:05:31.965 08:23:24 -- setup/common.sh@19 -- # local var val
00:05:31.965 08:23:24 -- setup/common.sh@20 -- # local mem_f mem
00:05:31.965 08:23:24 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:31.965 08:23:24 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:31.965 08:23:24 -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:31.965 08:23:24 -- setup/common.sh@28 -- # mapfile -t mem
00:05:31.965 08:23:24 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:31.966 08:23:24 -- setup/common.sh@31 -- # IFS=': '
00:05:31.966 08:23:24 -- setup/common.sh@31 -- # read -r var val _
00:05:31.966 08:23:24 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283780 kB' 'MemFree: 40467848 kB' 'MemAvailable: 44335300 kB' 'Buffers: 2708 kB' 'Cached: 13599736 kB' 'SwapCached: 0 kB' 'Active: 10340384 kB' 'Inactive: 3768020 kB' 'Active(anon): 9881580 kB' 'Inactive(anon): 0 kB' 'Active(file): 458804 kB' 'Inactive(file): 3768020 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 509256 kB' 'Mapped: 209256 kB' 'Shmem: 9375620 kB' 'KReclaimable: 274988 kB' 'Slab: 1241472 kB' 'SReclaimable: 274988 kB' 'SUnreclaim: 966484 kB' 'KernelStack: 21952 kB' 'PageTables: 8436 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 11100440 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216992 kB' 'VmallocChunk: 0 kB' 'Percpu: 85568 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3226996 kB' 'DirectMap2M: 12187648 kB' 'DirectMap1G: 54525952 kB'
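The per-node counts used by this verification (nodes_sys[0]=1024 and nodes_sys[1]=0 from the earlier get_nodes pass) come from per-node sysfs counters. A sketch of that get_nodes-style enumeration; the nr_hugepages path is an assumption based on the standard kernel sysfs layout for the 2048 kB page size this log reports:

    # Sketch of the get_nodes enumeration seen earlier (hugepages.sh@27-33).
    # The hugepages counter path below is assumed, not copied from the script.
    shopt -s extglob nullglob
    declare -A nodes_sys
    no_nodes=0
    for node in /sys/devices/system/node/node+([0-9]); do
        # e.g. /sys/devices/system/node/node0/hugepages/hugepages-2048kB/nr_hugepages
        nodes_sys[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
        ((++no_nodes))
    done
    echo "no_nodes=$no_nodes" # 2 on this machine: node0=1024, node1=0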
[... per-field scan of the snapshot: MemTotal through HugePages_Free are each skipped with "continue" ...]
00:05:31.967 08:23:24 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:31.967 08:23:24 -- setup/common.sh@33 -- # echo 0
00:05:31.967 08:23:24 -- setup/common.sh@33 -- # return 0
00:05:31.967 08:23:24 -- setup/hugepages.sh@100 -- # resv=0
00:05:31.967 08:23:24 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:05:31.967 nr_hugepages=1024
00:05:31.967 08:23:24 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:05:31.967 resv_hugepages=0
00:05:31.967 08:23:24 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:05:31.967 surplus_hugepages=0
00:05:31.967 08:23:24 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:05:31.967 anon_hugepages=0
00:05:31.967 08:23:24 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:31.967 08:23:24 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:05:31.967 08:23:24 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:05:31.967 08:23:24 -- setup/common.sh@17 -- # local get=HugePages_Total
00:05:31.967 08:23:24 -- setup/common.sh@18 -- # local node=
00:05:31.967 08:23:24 -- setup/common.sh@19 -- # local var val
00:05:31.967 08:23:24 -- setup/common.sh@20 -- # local mem_f mem
00:05:31.967 08:23:24 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:31.967 08:23:24 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:31.967 08:23:24 -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:31.967 08:23:24 -- setup/common.sh@28 -- # mapfile -t mem
00:05:31.967 08:23:24 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:31.967 08:23:24 -- setup/common.sh@31 -- # IFS=': '
00:05:31.967 08:23:24 -- setup/common.sh@31 -- # read -r var val _
00:05:31.968 08:23:24 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283780 kB' 'MemFree: 40467516 kB' 'MemAvailable: 44334968 kB' 'Buffers: 2708 kB' 'Cached: 13599752 kB' 'SwapCached: 0 kB' 'Active: 10340100 kB' 'Inactive: 3768020 kB' 'Active(anon): 9881296 kB' 'Inactive(anon): 0 kB' 'Active(file): 458804 kB' 'Inactive(file): 3768020 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 508884 kB' 'Mapped: 209256 kB' 'Shmem: 9375636 kB' 'KReclaimable: 274988 kB' 'Slab: 1241480 kB' 'SReclaimable: 274988 kB' 'SUnreclaim: 966492 kB' 'KernelStack: 21920 kB' 'PageTables: 8716 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 11100596 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217024 kB' 'VmallocChunk: 0 kB' 'Percpu: 85568 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3226996 kB' 'DirectMap2M: 12187648 kB' 'DirectMap1G: 54525952 kB'
[... per-field scan of the snapshot under way: MemTotal through Inactive(file) checked against HugePages_Total, each skipped with "continue" ...]
08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.968 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.968 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.968 08:23:24 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:31.968 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.968 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.968 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.968 08:23:24 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:31.968 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.968 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.968 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.968 08:23:24 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:31.968 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.968 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.968 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.968 08:23:24 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:31.968 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.968 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.968 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.968 08:23:24 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:31.968 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.968 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.968 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.968 08:23:24 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:31.968 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.968 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.968 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.968 08:23:24 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:31.968 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.968 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.968 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.968 08:23:24 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:31.968 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.968 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.968 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.968 08:23:24 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:31.968 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.968 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.968 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.968 08:23:24 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:31.968 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.968 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.968 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.968 08:23:24 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:31.968 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.968 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.968 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.968 08:23:24 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:31.968 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.968 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.968 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.968 
08:23:24 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:31.968 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.968 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.968 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.968 08:23:24 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:31.968 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.968 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.968 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.968 08:23:24 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:31.968 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.968 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # continue 
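The long run of '[[ <field> == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] ... continue' entries around this point is the xtrace of get_meminfo in setup/common.sh scanning every key of /proc/meminfo (or a per-node meminfo file) until it reaches the one requested — here HugePages_Total, which resolves to 1024 a few entries further on. A minimal sketch of that lookup under the stock meminfo layout; the sed pattern that strips the 'Node <N> ' prefix is an assumption standing in for the array rewrite the real helper performs:

get_meminfo() {
    # Print the numeric value for one meminfo key. An optional second
    # argument selects the per-node file under /sys/devices/system/node.
    local key=$1 node=$2 file=/proc/meminfo
    [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
        file=/sys/devices/system/node/node$node/meminfo
    sed -n "s/^\(Node [0-9]\+ \)\{0,1\}$key:[[:space:]]*\([0-9]\+\).*/\2/p" "$file"
}

get_meminfo HugePages_Total   # -> 1024 in this run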
00:05:31.969 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:31.969 08:23:24 -- setup/common.sh@33 -- # echo 1024 00:05:31.969 08:23:24 -- setup/common.sh@33 -- # return 0 00:05:31.969 08:23:24 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages 
+ surp + resv )) 00:05:31.969 08:23:24 -- setup/hugepages.sh@112 -- # get_nodes 00:05:31.969 08:23:24 -- setup/hugepages.sh@27 -- # local node 00:05:31.969 08:23:24 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:31.969 08:23:24 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:31.969 08:23:24 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:31.969 08:23:24 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:05:31.969 08:23:24 -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:31.969 08:23:24 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:31.969 08:23:24 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:31.969 08:23:24 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:31.969 08:23:24 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:31.969 08:23:24 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:31.969 08:23:24 -- setup/common.sh@18 -- # local node=0 00:05:31.969 08:23:24 -- setup/common.sh@19 -- # local var val 00:05:31.969 08:23:24 -- setup/common.sh@20 -- # local mem_f mem 00:05:31.969 08:23:24 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:31.969 08:23:24 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:31.969 08:23:24 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:31.969 08:23:24 -- setup/common.sh@28 -- # mapfile -t mem 00:05:31.969 08:23:24 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.969 08:23:24 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 23834348 kB' 'MemUsed: 8751020 kB' 'SwapCached: 0 kB' 'Active: 4700480 kB' 'Inactive: 253652 kB' 'Active(anon): 4452320 kB' 'Inactive(anon): 0 kB' 'Active(file): 248160 kB' 'Inactive(file): 253652 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4635744 kB' 'Mapped: 52668 kB' 'AnonPages: 321592 kB' 'Shmem: 4133932 kB' 'KernelStack: 11928 kB' 'PageTables: 5524 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 122784 kB' 'Slab: 586460 kB' 'SReclaimable: 122784 kB' 'SUnreclaim: 463676 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:31.969 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.969 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # continue 
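get_nodes above records one entry per NUMA node (nodes_sys[0]=1024, nodes_sys[1]=0, no_nodes=2) before the node0 HugePages_Surp lookup that the following entries scan for. A sketch of that enumeration, reusing the get_meminfo sketch from earlier; deriving the counts from the per-node meminfo files is an assumption, the sysfs layout is as printed in the log:

declare -A nodes_sys
for node in /sys/devices/system/node/node[0-9]*; do
    n=${node##*node}
    # Per-node hugepage totals live in the node's own meminfo file.
    nodes_sys[$n]=$(get_meminfo HugePages_Total "$n")
done
echo "no_nodes=${#nodes_sys[@]}"   # 2 on this host; node0 holds all 1024 pages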
00:05:31.970 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # [[ AnonPages == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.970 08:23:24 -- 
setup/common.sh@31 -- # read -r var val _ 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:31.970 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.970 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.971 08:23:24 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:31.971 08:23:24 -- setup/common.sh@32 -- # continue 00:05:31.971 08:23:24 -- setup/common.sh@31 -- # IFS=': ' 00:05:31.971 08:23:24 -- setup/common.sh@31 -- # read -r var val _ 00:05:31.971 08:23:24 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:31.971 08:23:24 -- setup/common.sh@33 -- # echo 0 00:05:31.971 08:23:24 -- setup/common.sh@33 -- # return 0 00:05:31.971 08:23:24 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:31.971 08:23:24 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:31.971 08:23:24 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:31.971 08:23:24 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:31.971 08:23:24 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:05:31.971 node0=1024 expecting 1024 00:05:31.971 08:23:24 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:05:31.971 00:05:31.971 real 0m7.235s 00:05:31.971 user 0m2.697s 00:05:31.971 sys 0m4.676s 00:05:31.971 08:23:24 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:31.971 08:23:24 -- common/autotest_common.sh@10 -- # set +x 00:05:31.971 ************************************ 00:05:31.971 END TEST no_shrink_alloc 00:05:31.971 ************************************ 00:05:31.971 08:23:24 -- setup/hugepages.sh@217 -- # clear_hp 00:05:31.971 08:23:24 -- setup/hugepages.sh@37 -- # local node hp 00:05:31.971 08:23:24 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:05:31.971 08:23:24 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:31.971 08:23:24 -- setup/hugepages.sh@41 -- # echo 0 00:05:31.971 
08:23:24 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:31.971 08:23:24 -- setup/hugepages.sh@41 -- # echo 0 00:05:31.971 08:23:24 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:05:31.971 08:23:24 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:31.971 08:23:24 -- setup/hugepages.sh@41 -- # echo 0 00:05:31.971 08:23:24 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:31.971 08:23:24 -- setup/hugepages.sh@41 -- # echo 0 00:05:31.971 08:23:24 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:05:31.971 08:23:24 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:05:31.971 00:05:31.971 real 0m27.319s 00:05:31.971 user 0m9.629s 00:05:31.971 sys 0m16.521s 00:05:31.971 08:23:24 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:31.971 08:23:24 -- common/autotest_common.sh@10 -- # set +x 00:05:31.971 ************************************ 00:05:31.971 END TEST hugepages 00:05:31.971 ************************************ 00:05:31.971 08:23:24 -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:05:31.971 08:23:24 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:31.971 08:23:24 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:31.971 08:23:24 -- common/autotest_common.sh@10 -- # set +x 00:05:31.971 ************************************ 00:05:31.971 START TEST driver 00:05:31.971 ************************************ 00:05:31.971 08:23:24 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:05:31.971 * Looking for test storage... 
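The clear_hp teardown that closes the hugepages suite above writes 0 into every per-node hugepage pool and exports CLEAR_HUGE=yes for later stages. The same effect as a standalone sketch (root required; paths follow the sysfs layout in the log):

for node in /sys/devices/system/node/node[0-9]*; do
    for hp in "$node"/hugepages/hugepages-*; do
        echo 0 > "$hp/nr_hugepages"   # release this node's pool
    done
done
export CLEAR_HUGE=yes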
00:05:31.971 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:05:31.971 08:23:24 -- setup/driver.sh@68 -- # setup reset 00:05:31.971 08:23:24 -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:31.971 08:23:24 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:37.241 08:23:28 -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:05:37.241 08:23:28 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:37.241 08:23:28 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:37.241 08:23:28 -- common/autotest_common.sh@10 -- # set +x 00:05:37.241 ************************************ 00:05:37.241 START TEST guess_driver 00:05:37.241 ************************************ 00:05:37.241 08:23:28 -- common/autotest_common.sh@1104 -- # guess_driver 00:05:37.241 08:23:28 -- setup/driver.sh@46 -- # local driver setup_driver marker 00:05:37.241 08:23:28 -- setup/driver.sh@47 -- # local fail=0 00:05:37.241 08:23:28 -- setup/driver.sh@49 -- # pick_driver 00:05:37.241 08:23:28 -- setup/driver.sh@36 -- # vfio 00:05:37.241 08:23:28 -- setup/driver.sh@21 -- # local iommu_grups 00:05:37.241 08:23:28 -- setup/driver.sh@22 -- # local unsafe_vfio 00:05:37.241 08:23:28 -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:05:37.241 08:23:28 -- setup/driver.sh@25 -- # unsafe_vfio=N 00:05:37.241 08:23:28 -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:05:37.241 08:23:28 -- setup/driver.sh@29 -- # (( 176 > 0 )) 00:05:37.241 08:23:28 -- setup/driver.sh@30 -- # is_driver vfio_pci 00:05:37.241 08:23:28 -- setup/driver.sh@14 -- # mod vfio_pci 00:05:37.241 08:23:28 -- setup/driver.sh@12 -- # dep vfio_pci 00:05:37.241 08:23:28 -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:05:37.241 08:23:28 -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:05:37.241 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:37.241 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:37.241 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:37.241 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:37.241 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:05:37.241 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:05:37.241 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:05:37.241 08:23:28 -- setup/driver.sh@30 -- # return 0 00:05:37.241 08:23:28 -- setup/driver.sh@37 -- # echo vfio-pci 00:05:37.241 08:23:28 -- setup/driver.sh@49 -- # driver=vfio-pci 00:05:37.241 08:23:28 -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:05:37.241 08:23:28 -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:05:37.241 Looking for driver=vfio-pci 00:05:37.241 08:23:28 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:37.241 08:23:28 -- setup/driver.sh@45 -- # setup output config 00:05:37.241 08:23:28 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:37.241 08:23:28 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:39.778 08:23:32 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:39.778 08:23:32 -- setup/driver.sh@61 -- # [[ 
vfio-pci == vfio-pci ]] 00:05:39.778 08:23:32 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:39.778 08:23:32 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:39.778 08:23:32 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:39.778 08:23:32 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:39.778 08:23:32 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:39.778 08:23:32 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:39.778 08:23:32 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:39.778 08:23:32 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:39.778 08:23:32 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:39.778 08:23:32 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:39.778 08:23:32 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:39.778 08:23:32 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:39.778 08:23:32 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:39.778 08:23:32 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:39.778 08:23:32 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:39.778 08:23:32 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:39.778 08:23:32 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:39.778 08:23:32 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:39.778 08:23:32 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:39.778 08:23:32 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:39.778 08:23:32 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:39.778 08:23:32 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:39.778 08:23:32 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:39.778 08:23:32 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:39.778 08:23:32 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:39.778 08:23:32 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:39.778 08:23:32 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:39.778 08:23:32 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:39.778 08:23:32 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:39.778 08:23:32 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:39.778 08:23:32 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:39.778 08:23:32 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:39.778 08:23:32 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:39.778 08:23:32 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:39.778 08:23:32 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:39.778 08:23:32 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:39.778 08:23:32 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:39.778 08:23:32 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:39.778 08:23:32 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:39.778 08:23:32 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:39.778 08:23:32 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:39.778 08:23:32 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:39.778 08:23:32 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:39.778 08:23:32 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:39.778 08:23:32 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:39.778 08:23:32 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:41.687 08:23:33 -- 
setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:41.687 08:23:33 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:41.687 08:23:33 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:41.687 08:23:33 -- setup/driver.sh@64 -- # (( fail == 0 )) 00:05:41.687 08:23:33 -- setup/driver.sh@65 -- # setup reset 00:05:41.687 08:23:33 -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:41.687 08:23:33 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:46.964 00:05:46.964 real 0m9.894s 00:05:46.964 user 0m2.676s 00:05:46.964 sys 0m5.000s 00:05:46.964 08:23:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:46.964 08:23:38 -- common/autotest_common.sh@10 -- # set +x 00:05:46.964 ************************************ 00:05:46.964 END TEST guess_driver 00:05:46.964 ************************************ 00:05:46.964 00:05:46.964 real 0m14.618s 00:05:46.964 user 0m4.023s 00:05:46.964 sys 0m7.614s 00:05:46.964 08:23:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:46.964 08:23:38 -- common/autotest_common.sh@10 -- # set +x 00:05:46.964 ************************************ 00:05:46.964 END TEST driver 00:05:46.964 ************************************ 00:05:46.964 08:23:38 -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:05:46.964 08:23:38 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:46.964 08:23:38 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:46.964 08:23:38 -- common/autotest_common.sh@10 -- # set +x 00:05:46.964 ************************************ 00:05:46.964 START TEST devices 00:05:46.964 ************************************ 00:05:46.964 08:23:38 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:05:46.964 * Looking for test storage... 
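guess_driver above settled on vfio-pci because the host exposes IOMMU groups ('(( 176 > 0 ))') and 'modprobe --show-depends vfio_pci' resolved to loadable .ko modules; driver.sh@51 only rejects the choice when the result is the literal string 'No valid driver found'. A condensed sketch of that decision (the real script also consults the vfio unsafe-noiommu parameter, omitted here):

groups=(/sys/kernel/iommu_groups/*)
if (( ${#groups[@]} > 0 )) &&
   modprobe --show-depends vfio_pci 2>/dev/null | grep -q '\.ko'; then
    driver=vfio-pci
else
    driver='No valid driver found'   # the sentinel tested at driver.sh@51
fi
echo "Looking for driver=$driver"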
00:05:46.964 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:05:46.964 08:23:38 -- setup/devices.sh@190 -- # trap cleanup EXIT 00:05:46.964 08:23:38 -- setup/devices.sh@192 -- # setup reset 00:05:46.964 08:23:38 -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:46.964 08:23:38 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:50.253 08:23:42 -- setup/devices.sh@194 -- # get_zoned_devs 00:05:50.253 08:23:42 -- common/autotest_common.sh@1654 -- # zoned_devs=() 00:05:50.253 08:23:42 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs 00:05:50.253 08:23:42 -- common/autotest_common.sh@1655 -- # local nvme bdf 00:05:50.253 08:23:42 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme* 00:05:50.253 08:23:42 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1 00:05:50.253 08:23:42 -- common/autotest_common.sh@1647 -- # local device=nvme0n1 00:05:50.253 08:23:42 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:05:50.253 08:23:42 -- common/autotest_common.sh@1650 -- # [[ none != none ]] 00:05:50.253 08:23:42 -- setup/devices.sh@196 -- # blocks=() 00:05:50.253 08:23:42 -- setup/devices.sh@196 -- # declare -a blocks 00:05:50.253 08:23:42 -- setup/devices.sh@197 -- # blocks_to_pci=() 00:05:50.253 08:23:42 -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:05:50.253 08:23:42 -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:05:50.253 08:23:42 -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:05:50.253 08:23:42 -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:05:50.253 08:23:42 -- setup/devices.sh@201 -- # ctrl=nvme0 00:05:50.253 08:23:42 -- setup/devices.sh@202 -- # pci=0000:d8:00.0 00:05:50.253 08:23:42 -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:05:50.253 08:23:42 -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:05:50.253 08:23:42 -- scripts/common.sh@380 -- # local block=nvme0n1 pt 00:05:50.253 08:23:42 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:05:50.253 No valid GPT data, bailing 00:05:50.253 08:23:42 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:05:50.253 08:23:42 -- scripts/common.sh@393 -- # pt= 00:05:50.253 08:23:42 -- scripts/common.sh@394 -- # return 1 00:05:50.253 08:23:42 -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:05:50.253 08:23:42 -- setup/common.sh@76 -- # local dev=nvme0n1 00:05:50.253 08:23:42 -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:05:50.253 08:23:42 -- setup/common.sh@80 -- # echo 1600321314816 00:05:50.253 08:23:42 -- setup/devices.sh@204 -- # (( 1600321314816 >= min_disk_size )) 00:05:50.253 08:23:42 -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:05:50.253 08:23:42 -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:d8:00.0 00:05:50.253 08:23:42 -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:05:50.253 08:23:42 -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:05:50.253 08:23:42 -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:05:50.253 08:23:42 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:50.253 08:23:42 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:50.253 08:23:42 -- common/autotest_common.sh@10 -- # set +x 00:05:50.253 ************************************ 00:05:50.253 START TEST nvme_mount 00:05:50.253 ************************************ 00:05:50.253 
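Before nvme_mount starts, devices.sh above filtered out zoned namespaces, let spdk-gpt.py confirm the disk carries no GPT ('No valid GPT data, bailing' is the pass case here), and accepted nvme0n1 because 1600321314816 bytes clears min_disk_size=3221225472. The capacity gate, sketched from the sysfs fields the script reads:

min_disk_size=$((3 * 1024 * 1024 * 1024))     # 3221225472 bytes, as in the log
for block in /sys/block/nvme*; do
    bytes=$(( $(cat "$block/size") * 512 ))   # 'size' counts 512-byte sectors
    (( bytes >= min_disk_size )) && echo "${block##*/}: $bytes bytes, big enough"
done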
08:23:42 -- common/autotest_common.sh@1104 -- # nvme_mount 00:05:50.253 08:23:42 -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:05:50.253 08:23:42 -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:05:50.253 08:23:42 -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:50.253 08:23:42 -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:50.253 08:23:42 -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:05:50.253 08:23:42 -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:50.253 08:23:42 -- setup/common.sh@40 -- # local part_no=1 00:05:50.253 08:23:42 -- setup/common.sh@41 -- # local size=1073741824 00:05:50.253 08:23:42 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:50.253 08:23:42 -- setup/common.sh@44 -- # parts=() 00:05:50.253 08:23:42 -- setup/common.sh@44 -- # local parts 00:05:50.253 08:23:42 -- setup/common.sh@46 -- # (( part = 1 )) 00:05:50.253 08:23:42 -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:50.253 08:23:42 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:50.253 08:23:42 -- setup/common.sh@46 -- # (( part++ )) 00:05:50.253 08:23:42 -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:50.253 08:23:42 -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:50.253 08:23:42 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:50.253 08:23:42 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:05:51.189 Creating new GPT entries in memory. 00:05:51.189 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:51.189 other utilities. 00:05:51.189 08:23:43 -- setup/common.sh@57 -- # (( part = 1 )) 00:05:51.189 08:23:43 -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:51.189 08:23:43 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:51.189 08:23:43 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:51.189 08:23:43 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:52.124 Creating new GPT entries in memory. 00:05:52.124 The operation has completed successfully. 
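partition_drive above zaps the GPT and creates one partition whose geometry follows from '(( size /= 512 ))': 1073741824 bytes become 2097152 sectors, so a first partition starting at sector 2048 ends at 2099199 — exactly the '--new=1:2048:2099199' in the log. The same steps as a sketch, with 'udevadm settle' standing in for scripts/sync_dev_uevents.sh:

disk=/dev/nvme0n1
sgdisk "$disk" --zap-all
# 1 GiB = 2097152 sectors; GPT's first usable sector is 2048.
flock "$disk" sgdisk "$disk" --new=1:2048:2099199
udevadm settle   # wait until the kernel/udev has published nvme0n1p1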
00:05:52.124 08:23:44 -- setup/common.sh@57 -- # (( part++ )) 00:05:52.124 08:23:44 -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:52.124 08:23:44 -- setup/common.sh@62 -- # wait 973160 00:05:52.382 08:23:44 -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:52.382 08:23:44 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:05:52.382 08:23:44 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:52.382 08:23:44 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:05:52.382 08:23:44 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:05:52.382 08:23:44 -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:52.382 08:23:44 -- setup/devices.sh@105 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:52.382 08:23:44 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:52.382 08:23:44 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:05:52.382 08:23:44 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:52.382 08:23:44 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:52.382 08:23:44 -- setup/devices.sh@53 -- # local found=0 00:05:52.382 08:23:44 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:52.382 08:23:44 -- setup/devices.sh@56 -- # : 00:05:52.382 08:23:44 -- setup/devices.sh@59 -- # local pci status 00:05:52.382 08:23:44 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.382 08:23:44 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:52.382 08:23:44 -- setup/devices.sh@47 -- # setup output config 00:05:52.382 08:23:44 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:52.382 08:23:44 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:55.751 08:23:47 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:55.752 08:23:47 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:05:55.752 08:23:47 -- setup/devices.sh@63 -- # found=1 00:05:55.752 08:23:47 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:55.752 08:23:47 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:55.752 08:23:47 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:55.752 08:23:47 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:55.752 08:23:47 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:55.752 08:23:47 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:55.752 08:23:47 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:55.752 08:23:47 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:55.752 08:23:47 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:55.752 08:23:47 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 
]] 00:05:55.752 08:23:47 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:55.752 08:23:47 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:55.752 08:23:47 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:55.752 08:23:47 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:55.752 08:23:47 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:55.752 08:23:47 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:55.752 08:23:47 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:55.752 08:23:47 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:55.752 08:23:47 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:55.752 08:23:47 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:55.752 08:23:47 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:55.752 08:23:47 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:55.752 08:23:47 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:55.752 08:23:47 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:55.752 08:23:47 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:55.752 08:23:47 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:55.752 08:23:47 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:55.752 08:23:47 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:55.752 08:23:47 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:55.752 08:23:47 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:55.752 08:23:47 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:55.752 08:23:47 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:55.752 08:23:47 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:55.752 08:23:47 -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:55.752 08:23:47 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:55.752 08:23:47 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:55.752 08:23:47 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:55.752 08:23:47 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:55.752 08:23:47 -- setup/devices.sh@110 -- # cleanup_nvme 00:05:55.752 08:23:47 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:55.752 08:23:47 -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:55.752 08:23:47 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:55.752 08:23:47 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:55.752 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:55.752 08:23:47 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:55.752 08:23:47 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:55.752 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:55.752 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:05:55.752 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe 
(PMBR): 55 aa 00:05:55.752 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:55.752 08:23:48 -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:05:55.752 08:23:48 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:05:55.752 08:23:48 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:55.752 08:23:48 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:05:55.752 08:23:48 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:05:55.752 08:23:48 -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:55.752 08:23:48 -- setup/devices.sh@116 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:55.752 08:23:48 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:55.752 08:23:48 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:05:55.752 08:23:48 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:55.752 08:23:48 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:55.752 08:23:48 -- setup/devices.sh@53 -- # local found=0 00:05:55.752 08:23:48 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:55.752 08:23:48 -- setup/devices.sh@56 -- # : 00:05:55.752 08:23:48 -- setup/devices.sh@59 -- # local pci status 00:05:55.752 08:23:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:55.752 08:23:48 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:55.752 08:23:48 -- setup/devices.sh@47 -- # setup output config 00:05:55.752 08:23:48 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:55.752 08:23:48 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:59.042 08:23:51 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:59.042 08:23:51 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:05:59.042 08:23:51 -- setup/devices.sh@63 -- # found=1 00:05:59.042 08:23:51 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:59.042 08:23:51 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:59.042 08:23:51 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:59.042 08:23:51 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:59.042 08:23:51 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:59.042 08:23:51 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:59.042 08:23:51 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:59.042 08:23:51 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:59.042 08:23:51 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:59.042 08:23:51 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:59.042 08:23:51 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:59.042 08:23:51 -- 
setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:59.042 08:23:51 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:59.042 08:23:51 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:59.042 08:23:51 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:59.043 08:23:51 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:59.043 08:23:51 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:59.043 08:23:51 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:59.043 08:23:51 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:59.043 08:23:51 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:59.043 08:23:51 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:59.043 08:23:51 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:59.043 08:23:51 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:59.043 08:23:51 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:59.043 08:23:51 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:59.043 08:23:51 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:59.043 08:23:51 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:59.043 08:23:51 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:59.043 08:23:51 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:59.043 08:23:51 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:59.043 08:23:51 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:59.043 08:23:51 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:59.043 08:23:51 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:59.043 08:23:51 -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:59.043 08:23:51 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:59.043 08:23:51 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:59.043 08:23:51 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:59.043 08:23:51 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:59.303 08:23:51 -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:59.303 08:23:51 -- setup/devices.sh@125 -- # verify 0000:d8:00.0 data@nvme0n1 '' '' 00:05:59.303 08:23:51 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:59.303 08:23:51 -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:05:59.303 08:23:51 -- setup/devices.sh@50 -- # local mount_point= 00:05:59.303 08:23:51 -- setup/devices.sh@51 -- # local test_file= 00:05:59.303 08:23:51 -- setup/devices.sh@53 -- # local found=0 00:05:59.303 08:23:51 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:59.303 08:23:51 -- setup/devices.sh@59 -- # local pci status 00:05:59.303 08:23:51 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:59.303 08:23:51 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:59.303 08:23:51 -- setup/devices.sh@47 -- # setup output config 00:05:59.303 08:23:51 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:59.303 08:23:51 -- setup/common.sh@10 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:02.590 08:23:54 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.590 08:23:54 -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:06:02.590 08:23:54 -- setup/devices.sh@63 -- # found=1 00:06:02.590 08:23:54 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.590 08:23:54 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.590 08:23:54 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.590 08:23:54 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.590 08:23:54 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.590 08:23:54 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.590 08:23:54 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.590 08:23:54 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.590 08:23:54 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.590 08:23:54 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.590 08:23:54 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.590 08:23:54 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.590 08:23:54 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.590 08:23:54 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.590 08:23:54 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.590 08:23:54 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.590 08:23:54 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.590 08:23:54 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.590 08:23:54 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.590 08:23:54 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.590 08:23:54 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.590 08:23:54 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.590 08:23:54 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.590 08:23:54 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.590 08:23:54 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.590 08:23:54 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.590 08:23:54 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.590 08:23:54 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.590 08:23:54 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.590 08:23:54 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.590 08:23:54 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.590 08:23:54 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.590 08:23:54 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.590 08:23:54 -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:02.590 08:23:54 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:06:02.590 08:23:54 -- setup/devices.sh@68 -- # return 0 00:06:02.590 08:23:54 -- setup/devices.sh@128 -- # cleanup_nvme 00:06:02.590 08:23:54 -- setup/devices.sh@20 -- # mountpoint -q 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:02.590 08:23:54 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:02.590 08:23:54 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:06:02.590 08:23:54 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:06:02.590 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:06:02.590 00:06:02.590 real 0m12.290s 00:06:02.590 user 0m3.499s 00:06:02.590 sys 0m6.684s 00:06:02.590 08:23:54 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:02.590 08:23:54 -- common/autotest_common.sh@10 -- # set +x 00:06:02.590 ************************************ 00:06:02.590 END TEST nvme_mount 00:06:02.590 ************************************ 00:06:02.590 08:23:55 -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:06:02.590 08:23:55 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:02.590 08:23:55 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:02.590 08:23:55 -- common/autotest_common.sh@10 -- # set +x 00:06:02.590 ************************************ 00:06:02.590 START TEST dm_mount 00:06:02.590 ************************************ 00:06:02.590 08:23:55 -- common/autotest_common.sh@1104 -- # dm_mount 00:06:02.590 08:23:55 -- setup/devices.sh@144 -- # pv=nvme0n1 00:06:02.590 08:23:55 -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:06:02.590 08:23:55 -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:06:02.591 08:23:55 -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:06:02.591 08:23:55 -- setup/common.sh@39 -- # local disk=nvme0n1 00:06:02.591 08:23:55 -- setup/common.sh@40 -- # local part_no=2 00:06:02.591 08:23:55 -- setup/common.sh@41 -- # local size=1073741824 00:06:02.591 08:23:55 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:06:02.591 08:23:55 -- setup/common.sh@44 -- # parts=() 00:06:02.591 08:23:55 -- setup/common.sh@44 -- # local parts 00:06:02.591 08:23:55 -- setup/common.sh@46 -- # (( part = 1 )) 00:06:02.591 08:23:55 -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:02.591 08:23:55 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:06:02.591 08:23:55 -- setup/common.sh@46 -- # (( part++ )) 00:06:02.591 08:23:55 -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:02.591 08:23:55 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:06:02.591 08:23:55 -- setup/common.sh@46 -- # (( part++ )) 00:06:02.591 08:23:55 -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:02.591 08:23:55 -- setup/common.sh@51 -- # (( size /= 512 )) 00:06:02.591 08:23:55 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:06:02.591 08:23:55 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:06:03.529 Creating new GPT entries in memory. 00:06:03.529 GPT data structures destroyed! You may now partition the disk using fdisk or 00:06:03.529 other utilities. 00:06:03.529 08:23:56 -- setup/common.sh@57 -- # (( part = 1 )) 00:06:03.529 08:23:56 -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:03.529 08:23:56 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:06:03.529 08:23:56 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:06:03.529 08:23:56 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:06:04.467 Creating new GPT entries in memory. 00:06:04.467 The operation has completed successfully. 
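The sector numbers in the sgdisk --new calls traced here and just below are pure arithmetic on the 1 GiB size constant from setup/common.sh; a minimal bash sketch of that computation, reusing the variable names from the trace:

    # 1 GiB per partition, expressed in 512-byte sectors: 1073741824 / 512 = 2097152
    size=$(( 1073741824 / 512 ))
    part_start=2048                          # first partition begins at sector 2048
    part_end=$(( part_start + size - 1 ))    # 2048 + 2097152 - 1 = 2099199
    sgdisk /dev/nvme0n1 --new=1:${part_start}:${part_end}
    part_start=$(( part_end + 1 ))           # 2099200
    part_end=$(( part_start + size - 1 ))    # 4196351
    sgdisk /dev/nvme0n1 --new=2:${part_start}:${part_end}

This reproduces the --new=1:2048:2099199 and --new=2:2099200:4196351 bounds recorded in the trace.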
00:06:04.467 08:23:57 -- setup/common.sh@57 -- # (( part++ )) 00:06:04.467 08:23:57 -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:04.467 08:23:57 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:06:04.467 08:23:57 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:06:04.467 08:23:57 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:06:05.846 The operation has completed successfully. 00:06:05.846 08:23:58 -- setup/common.sh@57 -- # (( part++ )) 00:06:05.846 08:23:58 -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:05.846 08:23:58 -- setup/common.sh@62 -- # wait 977657 00:06:05.846 08:23:58 -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:06:05.846 08:23:58 -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:05.846 08:23:58 -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:05.846 08:23:58 -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:06:05.846 08:23:58 -- setup/devices.sh@160 -- # for t in {1..5} 00:06:05.846 08:23:58 -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:06:05.846 08:23:58 -- setup/devices.sh@161 -- # break 00:06:05.846 08:23:58 -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:06:05.846 08:23:58 -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:06:05.846 08:23:58 -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:06:05.846 08:23:58 -- setup/devices.sh@166 -- # dm=dm-0 00:06:05.846 08:23:58 -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:06:05.846 08:23:58 -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:06:05.846 08:23:58 -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:05.846 08:23:58 -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:06:05.846 08:23:58 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:05.846 08:23:58 -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:06:05.846 08:23:58 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:06:05.846 08:23:58 -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:05.846 08:23:58 -- setup/devices.sh@174 -- # verify 0000:d8:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:05.846 08:23:58 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:06:05.846 08:23:58 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:06:05.846 08:23:58 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:05.846 08:23:58 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:05.846 08:23:58 -- setup/devices.sh@53 -- # local found=0 00:06:05.846 08:23:58 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:06:05.846 08:23:58 -- setup/devices.sh@56 -- # : 00:06:05.846 
08:23:58 -- setup/devices.sh@59 -- # local pci status 00:06:05.846 08:23:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:05.846 08:23:58 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:06:05.846 08:23:58 -- setup/devices.sh@47 -- # setup output config 00:06:05.846 08:23:58 -- setup/common.sh@9 -- # [[ output == output ]] 00:06:05.846 08:23:58 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:09.132 08:24:01 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.132 08:24:01 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:06:09.132 08:24:01 -- setup/devices.sh@63 -- # found=1 00:06:09.132 08:24:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.132 08:24:01 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.132 08:24:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.132 08:24:01 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.132 08:24:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.132 08:24:01 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.132 08:24:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.132 08:24:01 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.132 08:24:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.132 08:24:01 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.132 08:24:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.132 08:24:01 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.132 08:24:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.132 08:24:01 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.132 08:24:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.132 08:24:01 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.132 08:24:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.132 08:24:01 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.132 08:24:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.132 08:24:01 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.132 08:24:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.132 08:24:01 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.132 08:24:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.132 08:24:01 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.132 08:24:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.132 08:24:01 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.132 08:24:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.132 08:24:01 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.132 08:24:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.132 08:24:01 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.132 08:24:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.132 08:24:01 -- setup/devices.sh@62 -- # [[ 
0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.132 08:24:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.132 08:24:01 -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:09.132 08:24:01 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:06:09.132 08:24:01 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:09.132 08:24:01 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:06:09.132 08:24:01 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:09.132 08:24:01 -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:09.132 08:24:01 -- setup/devices.sh@184 -- # verify 0000:d8:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:06:09.132 08:24:01 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:06:09.132 08:24:01 -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:06:09.132 08:24:01 -- setup/devices.sh@50 -- # local mount_point= 00:06:09.132 08:24:01 -- setup/devices.sh@51 -- # local test_file= 00:06:09.132 08:24:01 -- setup/devices.sh@53 -- # local found=0 00:06:09.132 08:24:01 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:06:09.132 08:24:01 -- setup/devices.sh@59 -- # local pci status 00:06:09.132 08:24:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.132 08:24:01 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:06:09.132 08:24:01 -- setup/devices.sh@47 -- # setup output config 00:06:09.132 08:24:01 -- setup/common.sh@9 -- # [[ output == output ]] 00:06:09.132 08:24:01 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:12.423 08:24:04 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:12.423 08:24:04 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:06:12.423 08:24:04 -- setup/devices.sh@63 -- # found=1 00:06:12.423 08:24:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:12.423 08:24:04 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:12.423 08:24:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:12.423 08:24:04 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:12.423 08:24:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:12.423 08:24:04 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:12.423 08:24:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:12.423 08:24:04 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:12.423 08:24:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:12.423 08:24:04 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:12.423 08:24:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:12.423 08:24:04 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:12.423 08:24:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:12.423 08:24:04 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:12.423 08:24:04 -- setup/devices.sh@60 -- 
# read -r pci _ _ status 00:06:12.423 08:24:04 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:12.423 08:24:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:12.423 08:24:04 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:12.423 08:24:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:12.423 08:24:04 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:12.423 08:24:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:12.423 08:24:04 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:12.423 08:24:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:12.423 08:24:04 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:12.423 08:24:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:12.423 08:24:04 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:12.423 08:24:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:12.423 08:24:04 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:12.423 08:24:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:12.423 08:24:04 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:12.423 08:24:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:12.423 08:24:04 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:12.423 08:24:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:12.423 08:24:04 -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:12.423 08:24:04 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:06:12.423 08:24:04 -- setup/devices.sh@68 -- # return 0 00:06:12.423 08:24:04 -- setup/devices.sh@187 -- # cleanup_dm 00:06:12.423 08:24:04 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:12.423 08:24:04 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:06:12.423 08:24:04 -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:06:12.423 08:24:05 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:12.423 08:24:05 -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:06:12.423 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:06:12.423 08:24:05 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:06:12.423 08:24:05 -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:06:12.423 00:06:12.423 real 0m10.000s 00:06:12.423 user 0m2.455s 00:06:12.423 sys 0m4.631s 00:06:12.423 08:24:05 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:12.423 08:24:05 -- common/autotest_common.sh@10 -- # set +x 00:06:12.423 ************************************ 00:06:12.423 END TEST dm_mount 00:06:12.423 ************************************ 00:06:12.423 08:24:05 -- setup/devices.sh@1 -- # cleanup 00:06:12.423 08:24:05 -- setup/devices.sh@11 -- # cleanup_nvme 00:06:12.423 08:24:05 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:12.423 08:24:05 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:12.423 08:24:05 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:06:12.423 08:24:05 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:06:12.423 08:24:05 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:06:12.681 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:06:12.681 /dev/nvme0n1: 8 
bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:06:12.681 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:06:12.681 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:06:12.681 08:24:05 -- setup/devices.sh@12 -- # cleanup_dm 00:06:12.681 08:24:05 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:12.940 08:24:05 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:06:12.940 08:24:05 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:12.940 08:24:05 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:06:12.940 08:24:05 -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:06:12.940 08:24:05 -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:06:12.940 00:06:12.940 real 0m26.512s 00:06:12.940 user 0m7.432s 00:06:12.940 sys 0m13.980s 00:06:12.940 08:24:05 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:12.940 08:24:05 -- common/autotest_common.sh@10 -- # set +x 00:06:12.940 ************************************ 00:06:12.940 END TEST devices 00:06:12.940 ************************************ 00:06:12.940 00:06:12.940 real 1m31.913s 00:06:12.940 user 0m28.371s 00:06:12.940 sys 0m52.138s 00:06:12.940 08:24:05 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:12.940 08:24:05 -- common/autotest_common.sh@10 -- # set +x 00:06:12.940 ************************************ 00:06:12.940 END TEST setup.sh 00:06:12.940 ************************************ 00:06:12.940 08:24:05 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:06:16.225 Hugepages 00:06:16.225 node hugesize free / total 00:06:16.225 node0 1048576kB 0 / 0 00:06:16.225 node0 2048kB 2048 / 2048 00:06:16.225 node1 1048576kB 0 / 0 00:06:16.225 node1 2048kB 0 / 0 00:06:16.225 00:06:16.225 Type BDF Vendor Device NUMA Driver Device Block devices 00:06:16.225 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:06:16.225 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:06:16.225 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:06:16.225 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:06:16.225 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:06:16.225 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:06:16.225 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:06:16.225 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:06:16.225 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:06:16.225 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:06:16.225 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:06:16.225 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:06:16.225 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:06:16.226 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:06:16.226 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:06:16.226 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:06:16.226 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:06:16.226 08:24:08 -- spdk/autotest.sh@141 -- # uname -s 00:06:16.226 08:24:08 -- spdk/autotest.sh@141 -- # [[ Linux == Linux ]] 00:06:16.226 08:24:08 -- spdk/autotest.sh@143 -- # nvme_namespace_revert 00:06:16.226 08:24:08 -- common/autotest_common.sh@1516 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:06:19.514 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:19.514 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:19.514 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:19.514 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:19.514 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 
00:06:19.514 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:19.514 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:19.514 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:19.514 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:19.514 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:19.514 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:19.514 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:19.514 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:19.514 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:19.773 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:19.773 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:21.152 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:06:21.152 08:24:13 -- common/autotest_common.sh@1517 -- # sleep 1 00:06:22.528 08:24:14 -- common/autotest_common.sh@1518 -- # bdfs=() 00:06:22.528 08:24:14 -- common/autotest_common.sh@1518 -- # local bdfs 00:06:22.528 08:24:14 -- common/autotest_common.sh@1519 -- # bdfs=($(get_nvme_bdfs)) 00:06:22.528 08:24:14 -- common/autotest_common.sh@1519 -- # get_nvme_bdfs 00:06:22.528 08:24:14 -- common/autotest_common.sh@1498 -- # bdfs=() 00:06:22.528 08:24:14 -- common/autotest_common.sh@1498 -- # local bdfs 00:06:22.528 08:24:14 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:22.528 08:24:14 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:22.528 08:24:14 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:06:22.528 08:24:14 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:06:22.528 08:24:14 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:d8:00.0 00:06:22.528 08:24:14 -- common/autotest_common.sh@1521 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:06:25.818 Waiting for block devices as requested 00:06:25.818 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:25.818 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:25.818 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:26.077 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:06:26.077 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:26.077 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:26.337 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:26.337 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:06:26.337 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:26.337 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:26.596 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:26.596 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:06:26.596 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:26.856 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:26.856 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:26.856 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:06:27.114 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:06:27.114 08:24:19 -- common/autotest_common.sh@1523 -- # for bdf in "${bdfs[@]}" 00:06:27.114 08:24:19 -- common/autotest_common.sh@1524 -- # get_nvme_ctrlr_from_bdf 0000:d8:00.0 00:06:27.114 08:24:19 -- common/autotest_common.sh@1487 -- # readlink -f /sys/class/nvme/nvme0 00:06:27.114 08:24:19 -- common/autotest_common.sh@1487 -- # grep 0000:d8:00.0/nvme/nvme 00:06:27.373 08:24:19 -- common/autotest_common.sh@1487 -- # bdf_sysfs_path=/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:06:27.373 08:24:19 -- common/autotest_common.sh@1488 -- # [[ -z 
/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 ]] 00:06:27.373 08:24:19 -- common/autotest_common.sh@1492 -- # basename /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:06:27.373 08:24:19 -- common/autotest_common.sh@1492 -- # printf '%s\n' nvme0 00:06:27.373 08:24:19 -- common/autotest_common.sh@1524 -- # nvme_ctrlr=/dev/nvme0 00:06:27.373 08:24:19 -- common/autotest_common.sh@1525 -- # [[ -z /dev/nvme0 ]] 00:06:27.373 08:24:19 -- common/autotest_common.sh@1530 -- # nvme id-ctrl /dev/nvme0 00:06:27.373 08:24:19 -- common/autotest_common.sh@1530 -- # grep oacs 00:06:27.373 08:24:19 -- common/autotest_common.sh@1530 -- # cut -d: -f2 00:06:27.373 08:24:19 -- common/autotest_common.sh@1530 -- # oacs=' 0xe' 00:06:27.373 08:24:19 -- common/autotest_common.sh@1531 -- # oacs_ns_manage=8 00:06:27.373 08:24:19 -- common/autotest_common.sh@1533 -- # [[ 8 -ne 0 ]] 00:06:27.373 08:24:19 -- common/autotest_common.sh@1539 -- # nvme id-ctrl /dev/nvme0 00:06:27.373 08:24:19 -- common/autotest_common.sh@1539 -- # grep unvmcap 00:06:27.373 08:24:19 -- common/autotest_common.sh@1539 -- # cut -d: -f2 00:06:27.373 08:24:19 -- common/autotest_common.sh@1539 -- # unvmcap=' 0' 00:06:27.373 08:24:19 -- common/autotest_common.sh@1540 -- # [[ 0 -eq 0 ]] 00:06:27.373 08:24:19 -- common/autotest_common.sh@1542 -- # continue 00:06:27.373 08:24:19 -- spdk/autotest.sh@146 -- # timing_exit pre_cleanup 00:06:27.373 08:24:19 -- common/autotest_common.sh@718 -- # xtrace_disable 00:06:27.373 08:24:19 -- common/autotest_common.sh@10 -- # set +x 00:06:27.373 08:24:19 -- spdk/autotest.sh@149 -- # timing_enter afterboot 00:06:27.373 08:24:19 -- common/autotest_common.sh@712 -- # xtrace_disable 00:06:27.373 08:24:19 -- common/autotest_common.sh@10 -- # set +x 00:06:27.373 08:24:19 -- spdk/autotest.sh@150 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:06:30.661 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:30.661 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:30.661 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:30.661 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:30.661 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:30.661 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:30.661 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:30.661 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:30.661 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:30.661 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:30.661 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:30.661 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:30.661 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:30.661 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:30.661 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:30.661 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:31.664 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:06:31.924 08:24:24 -- spdk/autotest.sh@151 -- # timing_exit afterboot 00:06:31.924 08:24:24 -- common/autotest_common.sh@718 -- # xtrace_disable 00:06:31.924 08:24:24 -- common/autotest_common.sh@10 -- # set +x 00:06:31.924 08:24:24 -- spdk/autotest.sh@155 -- # opal_revert_cleanup 00:06:31.924 08:24:24 -- common/autotest_common.sh@1576 -- # mapfile -t bdfs 00:06:31.924 08:24:24 -- common/autotest_common.sh@1576 -- # get_nvme_bdfs_by_id 0x0a54 00:06:31.924 08:24:24 -- common/autotest_common.sh@1562 -- # bdfs=() 00:06:31.924 08:24:24 -- common/autotest_common.sh@1562 -- # local bdfs 00:06:31.924 08:24:24 -- common/autotest_common.sh@1564 -- # 
get_nvme_bdfs 00:06:31.924 08:24:24 -- common/autotest_common.sh@1498 -- # bdfs=() 00:06:31.924 08:24:24 -- common/autotest_common.sh@1498 -- # local bdfs 00:06:31.924 08:24:24 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:31.924 08:24:24 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:31.924 08:24:24 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:06:31.924 08:24:24 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:06:31.924 08:24:24 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:d8:00.0 00:06:31.924 08:24:24 -- common/autotest_common.sh@1564 -- # for bdf in $(get_nvme_bdfs) 00:06:31.924 08:24:24 -- common/autotest_common.sh@1565 -- # cat /sys/bus/pci/devices/0000:d8:00.0/device 00:06:31.924 08:24:24 -- common/autotest_common.sh@1565 -- # device=0x0a54 00:06:31.924 08:24:24 -- common/autotest_common.sh@1566 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:06:31.924 08:24:24 -- common/autotest_common.sh@1567 -- # bdfs+=($bdf) 00:06:31.924 08:24:24 -- common/autotest_common.sh@1571 -- # printf '%s\n' 0000:d8:00.0 00:06:31.924 08:24:24 -- common/autotest_common.sh@1577 -- # [[ -z 0000:d8:00.0 ]] 00:06:31.924 08:24:24 -- common/autotest_common.sh@1582 -- # spdk_tgt_pid=987883 00:06:31.924 08:24:24 -- common/autotest_common.sh@1583 -- # waitforlisten 987883 00:06:31.924 08:24:24 -- common/autotest_common.sh@819 -- # '[' -z 987883 ']' 00:06:31.924 08:24:24 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:31.924 08:24:24 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:31.924 08:24:24 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:31.924 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:31.924 08:24:24 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:31.924 08:24:24 -- common/autotest_common.sh@10 -- # set +x 00:06:31.924 08:24:24 -- common/autotest_common.sh@1581 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:31.924 [2024-10-04 08:24:24.544527] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
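The opal_revert_cleanup flow traced above selects its target controllers by PCI device ID: get_nvme_bdfs collects transport addresses from gen_nvme.sh, and get_nvme_bdfs_by_id keeps only the BDFs whose sysfs device file matches 0x0a54. A minimal standalone sketch of that filter, assembled from the commands visible in the trace:

    bdfs=()
    rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    for bdf in $("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr'); do
        device=$(cat /sys/bus/pci/devices/$bdf/device)   # PCI device ID from sysfs, 0x0a54 for this drive
        [[ $device == 0x0a54 ]] && bdfs+=("$bdf")        # keep only matching controllers
    done
    printf '%s\n' "${bdfs[@]}"                           # here: 0000:d8:00.0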
00:06:31.924 [2024-10-04 08:24:24.544590] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid987883 ] 00:06:32.183 EAL: No free 2048 kB hugepages reported on node 1 00:06:32.183 [2024-10-04 08:24:24.611755] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:32.183 [2024-10-04 08:24:24.650371] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:32.183 [2024-10-04 08:24:24.650484] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:06:32.750 08:24:25 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:32.750 08:24:25 -- common/autotest_common.sh@852 -- # return 0 00:06:32.750 08:24:25 -- common/autotest_common.sh@1585 -- # bdf_id=0 00:06:32.750 08:24:25 -- common/autotest_common.sh@1586 -- # for bdf in "${bdfs[@]}" 00:06:32.750 08:24:25 -- common/autotest_common.sh@1587 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0 00:06:36.034 nvme0n1
00:06:36.034 08:24:28 -- common/autotest_common.sh@1589 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:06:36.034 [2024-10-04 08:24:28.550105] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:06:36.034 request: 00:06:36.034 { 00:06:36.034 "nvme_ctrlr_name": "nvme0", 00:06:36.034 "password": "test", 00:06:36.034 "method": "bdev_nvme_opal_revert", 00:06:36.034 "req_id": 1 00:06:36.034 } 00:06:36.034 Got JSON-RPC error response 00:06:36.034 response: 00:06:36.034 { 00:06:36.034 "code": -32602, 00:06:36.034 "message": "Invalid parameters" 00:06:36.034 } 00:06:36.034 08:24:28 -- common/autotest_common.sh@1589 -- # true 00:06:36.034 08:24:28 -- common/autotest_common.sh@1590 -- # (( ++bdf_id )) 00:06:36.034 08:24:28 -- common/autotest_common.sh@1593 -- # killprocess 987883 00:06:36.034 08:24:28 -- common/autotest_common.sh@926 -- # '[' -z 987883 ']' 00:06:36.034 08:24:28 -- common/autotest_common.sh@930 -- # kill -0 987883 00:06:36.034 08:24:28 -- common/autotest_common.sh@931 -- # uname 00:06:36.034 08:24:28 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:36.035 08:24:28 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 987883 00:06:36.035 08:24:28 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:36.035 08:24:28 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:36.035 08:24:28 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 987883' 00:06:36.035 killing process with pid 987883 00:06:36.035 08:24:28 -- common/autotest_common.sh@945 -- # kill 987883 00:06:36.035 08:24:28 -- common/autotest_common.sh@950 -- # wait 987883
00:06:36.035 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152
00:06:38.193 08:24:30 -- spdk/autotest.sh@161 -- # '[' 0 -eq 1 ']' 00:06:38.193 08:24:30 -- spdk/autotest.sh@165 -- # '[' 1 -eq 1 ']' 00:06:38.193 08:24:30 -- spdk/autotest.sh@166 -- # [[ 0 -eq 1 ]] 00:06:38.193 08:24:30 -- spdk/autotest.sh@166 -- # [[ 0 -eq 1 ]] 00:06:38.193 08:24:30 -- spdk/autotest.sh@173 -- # timing_enter lib 00:06:38.193 08:24:30 -- common/autotest_common.sh@712 -- # xtrace_disable 00:06:38.193 08:24:30 -- common/autotest_common.sh@10 -- # set +x 00:06:38.193 08:24:30 -- spdk/autotest.sh@175 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:06:38.193 08:24:30 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:38.193 08:24:30 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:38.193 08:24:30 -- common/autotest_common.sh@10 -- # set +x 00:06:38.193 ************************************ 00:06:38.452 START TEST env 00:06:38.452 ************************************ 00:06:38.452 08:24:30 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh * Looking for test storage...
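The START TEST / END TEST banners and the real/user/sys timings around each test in this log come from the run_test wrapper in autotest_common.sh; a rough sketch of what it appears to do, reconstructed from the banners alone (the actual helper also validates its argument count, which is what the '[' 2 -le 1 ']' checks above are):

    run_test() {
        local test_name=$1; shift
        echo '************************************'
        echo "START TEST $test_name"
        echo '************************************'
        time "$@"          # emits the real/user/sys timing seen after each test
        echo '************************************'
        echo "END TEST $test_name"
        echo '************************************'
    }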
00:06:38.452 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:06:38.452 08:24:30 -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:06:38.452 08:24:30 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:38.452 08:24:30 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:38.452 08:24:30 -- common/autotest_common.sh@10 -- # set +x 00:06:38.452 ************************************ 00:06:38.452 START TEST env_memory 00:06:38.452 ************************************ 00:06:38.452 08:24:30 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:06:38.452 00:06:38.452 00:06:38.452 CUnit - A unit testing framework for C - Version 2.1-3 00:06:38.452 http://cunit.sourceforge.net/ 00:06:38.452 00:06:38.452 00:06:38.452 Suite: memory 00:06:38.452 Test: alloc and free memory map ...[2024-10-04 08:24:30.971058] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:06:38.452 passed 00:06:38.452 Test: mem map translation ...[2024-10-04 08:24:30.984968] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:06:38.452 [2024-10-04 08:24:30.984986] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:06:38.452 [2024-10-04 08:24:30.985017] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:06:38.452 [2024-10-04 08:24:30.985025] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:06:38.452 passed 00:06:38.452 Test: mem map registration ...[2024-10-04 08:24:31.005765] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:06:38.452 [2024-10-04 08:24:31.005782] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:06:38.452 passed 00:06:38.452 Test: mem map adjacent registrations ...passed 00:06:38.452 00:06:38.452 Run Summary: Type Total Ran Passed Failed Inactive 00:06:38.452 suites 1 1 n/a 0 0 00:06:38.452 tests 4 4 4 0 0 00:06:38.452 asserts 152 152 152 0 n/a 00:06:38.452 00:06:38.452 Elapsed time = 0.086 seconds 00:06:38.452 00:06:38.452 real 0m0.099s 00:06:38.452 user 0m0.086s 00:06:38.453 sys 0m0.013s 00:06:38.453 08:24:31 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:38.453 08:24:31 -- common/autotest_common.sh@10 -- # set +x 00:06:38.453 ************************************ 00:06:38.453 END TEST env_memory 00:06:38.453 ************************************ 00:06:38.453 08:24:31 -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:38.453 08:24:31 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:38.453 08:24:31 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:38.453 08:24:31 -- common/autotest_common.sh@10 
-- # set +x 00:06:38.453 ************************************ 00:06:38.453 START TEST env_vtophys 00:06:38.453 ************************************ 00:06:38.453 08:24:31 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:38.453 EAL: lib.eal log level changed from notice to debug 00:06:38.453 EAL: Detected lcore 0 as core 0 on socket 0 00:06:38.453 EAL: Detected lcore 1 as core 1 on socket 0 00:06:38.453 EAL: Detected lcore 2 as core 2 on socket 0 00:06:38.453 EAL: Detected lcore 3 as core 3 on socket 0 00:06:38.453 EAL: Detected lcore 4 as core 4 on socket 0 00:06:38.453 EAL: Detected lcore 5 as core 5 on socket 0 00:06:38.453 EAL: Detected lcore 6 as core 6 on socket 0 00:06:38.453 EAL: Detected lcore 7 as core 8 on socket 0 00:06:38.453 EAL: Detected lcore 8 as core 9 on socket 0 00:06:38.453 EAL: Detected lcore 9 as core 10 on socket 0 00:06:38.453 EAL: Detected lcore 10 as core 11 on socket 0 00:06:38.453 EAL: Detected lcore 11 as core 12 on socket 0 00:06:38.453 EAL: Detected lcore 12 as core 13 on socket 0 00:06:38.453 EAL: Detected lcore 13 as core 14 on socket 0 00:06:38.453 EAL: Detected lcore 14 as core 16 on socket 0 00:06:38.453 EAL: Detected lcore 15 as core 17 on socket 0 00:06:38.453 EAL: Detected lcore 16 as core 18 on socket 0 00:06:38.453 EAL: Detected lcore 17 as core 19 on socket 0 00:06:38.453 EAL: Detected lcore 18 as core 20 on socket 0 00:06:38.453 EAL: Detected lcore 19 as core 21 on socket 0 00:06:38.453 EAL: Detected lcore 20 as core 22 on socket 0 00:06:38.453 EAL: Detected lcore 21 as core 24 on socket 0 00:06:38.453 EAL: Detected lcore 22 as core 25 on socket 0 00:06:38.453 EAL: Detected lcore 23 as core 26 on socket 0 00:06:38.453 EAL: Detected lcore 24 as core 27 on socket 0 00:06:38.453 EAL: Detected lcore 25 as core 28 on socket 0 00:06:38.453 EAL: Detected lcore 26 as core 29 on socket 0 00:06:38.453 EAL: Detected lcore 27 as core 30 on socket 0 00:06:38.453 EAL: Detected lcore 28 as core 0 on socket 1 00:06:38.453 EAL: Detected lcore 29 as core 1 on socket 1 00:06:38.453 EAL: Detected lcore 30 as core 2 on socket 1 00:06:38.453 EAL: Detected lcore 31 as core 3 on socket 1 00:06:38.453 EAL: Detected lcore 32 as core 4 on socket 1 00:06:38.453 EAL: Detected lcore 33 as core 5 on socket 1 00:06:38.453 EAL: Detected lcore 34 as core 6 on socket 1 00:06:38.453 EAL: Detected lcore 35 as core 8 on socket 1 00:06:38.453 EAL: Detected lcore 36 as core 9 on socket 1 00:06:38.453 EAL: Detected lcore 37 as core 10 on socket 1 00:06:38.453 EAL: Detected lcore 38 as core 11 on socket 1 00:06:38.453 EAL: Detected lcore 39 as core 12 on socket 1 00:06:38.453 EAL: Detected lcore 40 as core 13 on socket 1 00:06:38.453 EAL: Detected lcore 41 as core 14 on socket 1 00:06:38.453 EAL: Detected lcore 42 as core 16 on socket 1 00:06:38.453 EAL: Detected lcore 43 as core 17 on socket 1 00:06:38.453 EAL: Detected lcore 44 as core 18 on socket 1 00:06:38.453 EAL: Detected lcore 45 as core 19 on socket 1 00:06:38.453 EAL: Detected lcore 46 as core 20 on socket 1 00:06:38.453 EAL: Detected lcore 47 as core 21 on socket 1 00:06:38.453 EAL: Detected lcore 48 as core 22 on socket 1 00:06:38.453 EAL: Detected lcore 49 as core 24 on socket 1 00:06:38.453 EAL: Detected lcore 50 as core 25 on socket 1 00:06:38.453 EAL: Detected lcore 51 as core 26 on socket 1 00:06:38.453 EAL: Detected lcore 52 as core 27 on socket 1 00:06:38.453 EAL: Detected lcore 53 as core 28 on socket 1 00:06:38.453 EAL: Detected lcore 54 as core 
29 on socket 1 00:06:38.453 EAL: Detected lcore 55 as core 30 on socket 1 00:06:38.453 EAL: Detected lcore 56 as core 0 on socket 0 00:06:38.453 EAL: Detected lcore 57 as core 1 on socket 0 00:06:38.453 EAL: Detected lcore 58 as core 2 on socket 0 00:06:38.453 EAL: Detected lcore 59 as core 3 on socket 0 00:06:38.453 EAL: Detected lcore 60 as core 4 on socket 0 00:06:38.453 EAL: Detected lcore 61 as core 5 on socket 0 00:06:38.453 EAL: Detected lcore 62 as core 6 on socket 0 00:06:38.453 EAL: Detected lcore 63 as core 8 on socket 0 00:06:38.453 EAL: Detected lcore 64 as core 9 on socket 0 00:06:38.453 EAL: Detected lcore 65 as core 10 on socket 0 00:06:38.453 EAL: Detected lcore 66 as core 11 on socket 0 00:06:38.453 EAL: Detected lcore 67 as core 12 on socket 0 00:06:38.453 EAL: Detected lcore 68 as core 13 on socket 0 00:06:38.453 EAL: Detected lcore 69 as core 14 on socket 0 00:06:38.453 EAL: Detected lcore 70 as core 16 on socket 0 00:06:38.453 EAL: Detected lcore 71 as core 17 on socket 0 00:06:38.453 EAL: Detected lcore 72 as core 18 on socket 0 00:06:38.453 EAL: Detected lcore 73 as core 19 on socket 0 00:06:38.453 EAL: Detected lcore 74 as core 20 on socket 0 00:06:38.453 EAL: Detected lcore 75 as core 21 on socket 0 00:06:38.453 EAL: Detected lcore 76 as core 22 on socket 0 00:06:38.453 EAL: Detected lcore 77 as core 24 on socket 0 00:06:38.453 EAL: Detected lcore 78 as core 25 on socket 0 00:06:38.453 EAL: Detected lcore 79 as core 26 on socket 0 00:06:38.453 EAL: Detected lcore 80 as core 27 on socket 0 00:06:38.453 EAL: Detected lcore 81 as core 28 on socket 0 00:06:38.453 EAL: Detected lcore 82 as core 29 on socket 0 00:06:38.453 EAL: Detected lcore 83 as core 30 on socket 0 00:06:38.453 EAL: Detected lcore 84 as core 0 on socket 1 00:06:38.453 EAL: Detected lcore 85 as core 1 on socket 1 00:06:38.453 EAL: Detected lcore 86 as core 2 on socket 1 00:06:38.453 EAL: Detected lcore 87 as core 3 on socket 1 00:06:38.453 EAL: Detected lcore 88 as core 4 on socket 1 00:06:38.453 EAL: Detected lcore 89 as core 5 on socket 1 00:06:38.453 EAL: Detected lcore 90 as core 6 on socket 1 00:06:38.453 EAL: Detected lcore 91 as core 8 on socket 1 00:06:38.453 EAL: Detected lcore 92 as core 9 on socket 1 00:06:38.453 EAL: Detected lcore 93 as core 10 on socket 1 00:06:38.453 EAL: Detected lcore 94 as core 11 on socket 1 00:06:38.453 EAL: Detected lcore 95 as core 12 on socket 1 00:06:38.453 EAL: Detected lcore 96 as core 13 on socket 1 00:06:38.453 EAL: Detected lcore 97 as core 14 on socket 1 00:06:38.453 EAL: Detected lcore 98 as core 16 on socket 1 00:06:38.453 EAL: Detected lcore 99 as core 17 on socket 1 00:06:38.453 EAL: Detected lcore 100 as core 18 on socket 1 00:06:38.453 EAL: Detected lcore 101 as core 19 on socket 1 00:06:38.453 EAL: Detected lcore 102 as core 20 on socket 1 00:06:38.453 EAL: Detected lcore 103 as core 21 on socket 1 00:06:38.453 EAL: Detected lcore 104 as core 22 on socket 1 00:06:38.453 EAL: Detected lcore 105 as core 24 on socket 1 00:06:38.453 EAL: Detected lcore 106 as core 25 on socket 1 00:06:38.453 EAL: Detected lcore 107 as core 26 on socket 1 00:06:38.453 EAL: Detected lcore 108 as core 27 on socket 1 00:06:38.453 EAL: Detected lcore 109 as core 28 on socket 1 00:06:38.453 EAL: Detected lcore 110 as core 29 on socket 1 00:06:38.453 EAL: Detected lcore 111 as core 30 on socket 1 00:06:38.453 EAL: Maximum logical cores by configuration: 128 00:06:38.453 EAL: Detected CPU lcores: 112 00:06:38.453 EAL: Detected NUMA nodes: 2 00:06:38.453 EAL: Checking presence 
of .so 'librte_eal.so.23.0' 00:06:38.453 EAL: Checking presence of .so 'librte_eal.so.23' 00:06:38.453 EAL: Checking presence of .so 'librte_eal.so' 00:06:38.453 EAL: Detected static linkage of DPDK 00:06:38.453 EAL: No shared files mode enabled, IPC will be disabled 00:06:38.712 EAL: Bus pci wants IOVA as 'DC' 00:06:38.712 EAL: Buses did not request a specific IOVA mode. 00:06:38.712 EAL: IOMMU is available, selecting IOVA as VA mode. 00:06:38.712 EAL: Selected IOVA mode 'VA' 00:06:38.712 EAL: No free 2048 kB hugepages reported on node 1 00:06:38.712 EAL: Probing VFIO support... 00:06:38.712 EAL: IOMMU type 1 (Type 1) is supported 00:06:38.712 EAL: IOMMU type 7 (sPAPR) is not supported 00:06:38.712 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:06:38.712 EAL: VFIO support initialized 00:06:38.712 EAL: Ask a virtual area of 0x2e000 bytes 00:06:38.712 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:06:38.712 EAL: Setting up physically contiguous memory... 00:06:38.712 EAL: Setting maximum number of open files to 524288 00:06:38.712 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:06:38.712 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:06:38.712 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:06:38.712 EAL: Ask a virtual area of 0x61000 bytes 00:06:38.712 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:06:38.712 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:38.712 EAL: Ask a virtual area of 0x400000000 bytes 00:06:38.712 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:06:38.712 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:06:38.712 EAL: Ask a virtual area of 0x61000 bytes 00:06:38.712 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:06:38.712 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:38.712 EAL: Ask a virtual area of 0x400000000 bytes 00:06:38.712 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:06:38.712 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:06:38.712 EAL: Ask a virtual area of 0x61000 bytes 00:06:38.712 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:06:38.712 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:38.712 EAL: Ask a virtual area of 0x400000000 bytes 00:06:38.712 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:06:38.712 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:06:38.712 EAL: Ask a virtual area of 0x61000 bytes 00:06:38.712 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:06:38.712 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:38.712 EAL: Ask a virtual area of 0x400000000 bytes 00:06:38.712 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:06:38.712 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:06:38.712 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:06:38.712 EAL: Ask a virtual area of 0x61000 bytes 00:06:38.712 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:06:38.712 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:38.712 EAL: Ask a virtual area of 0x400000000 bytes 00:06:38.712 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:06:38.712 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:06:38.712 EAL: Ask a virtual area of 0x61000 bytes 00:06:38.712 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 
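The reservations above show EAL claiming address space up front: for each memseg list it asks for a small area (0x61000 bytes of fbarray bookkeeping) plus a large 0x400000000-byte window into which 2 MB hugepages can later be mapped without relocating anything. A minimal sketch of inspecting that layout at runtime, assuming a DPDK 22.11 process that has already called rte_eal_init(); the print format is illustrative, the function and fields are from rte_memory.h:

#include <inttypes.h>
#include <stdio.h>
#include <rte_memory.h>

/* Called once per memseg list; returning 0 continues the walk. */
static int
print_msl(const struct rte_memseg_list *msl, void *arg)
{
    (void)arg;
    printf("base_va=%p page_sz=0x%" PRIx64 " socket=%d len=0x%zx\n",
           msl->base_va, msl->page_sz, msl->socket_id, msl->len);
    return 0;
}

/* somewhere after rte_eal_init(): */
/* rte_memseg_list_walk(print_msl, NULL); */

Each line it prints should correspond to one "VA reserved for memseg list" entry in the log above.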
00:06:38.712 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:38.712 EAL: Ask a virtual area of 0x400000000 bytes 00:06:38.712 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:06:38.712 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:06:38.712 EAL: Ask a virtual area of 0x61000 bytes 00:06:38.712 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:06:38.712 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:38.712 EAL: Ask a virtual area of 0x400000000 bytes 00:06:38.712 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:06:38.712 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:06:38.712 EAL: Ask a virtual area of 0x61000 bytes 00:06:38.712 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:06:38.712 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:38.712 EAL: Ask a virtual area of 0x400000000 bytes 00:06:38.712 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:06:38.712 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:06:38.712 EAL: Hugepages will be freed exactly as allocated. 00:06:38.712 EAL: No shared files mode enabled, IPC is disabled 00:06:38.712 EAL: No shared files mode enabled, IPC is disabled 00:06:38.712 EAL: TSC frequency is ~2500000 KHz 00:06:38.712 EAL: Main lcore 0 is ready (tid=7f9153e5fa00;cpuset=[0]) 00:06:38.712 EAL: Trying to obtain current memory policy. 00:06:38.712 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:38.712 EAL: Restoring previous memory policy: 0 00:06:38.712 EAL: request: mp_malloc_sync 00:06:38.712 EAL: No shared files mode enabled, IPC is disabled 00:06:38.712 EAL: Heap on socket 0 was expanded by 2MB 00:06:38.712 EAL: No shared files mode enabled, IPC is disabled 00:06:38.712 EAL: Mem event callback 'spdk:(nil)' registered 00:06:38.712 00:06:38.712 00:06:38.712 CUnit - A unit testing framework for C - Version 2.1-3 00:06:38.712 http://cunit.sourceforge.net/ 00:06:38.712 00:06:38.712 00:06:38.712 Suite: components_suite 00:06:38.712 Test: vtophys_malloc_test ...passed 00:06:38.712 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:06:38.712 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:38.712 EAL: Restoring previous memory policy: 4 00:06:38.712 EAL: Calling mem event callback 'spdk:(nil)' 00:06:38.712 EAL: request: mp_malloc_sync 00:06:38.712 EAL: No shared files mode enabled, IPC is disabled 00:06:38.712 EAL: Heap on socket 0 was expanded by 4MB 00:06:38.712 EAL: Calling mem event callback 'spdk:(nil)' 00:06:38.712 EAL: request: mp_malloc_sync 00:06:38.712 EAL: No shared files mode enabled, IPC is disabled 00:06:38.712 EAL: Heap on socket 0 was shrunk by 4MB 00:06:38.712 EAL: Trying to obtain current memory policy. 00:06:38.712 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:38.712 EAL: Restoring previous memory policy: 4 00:06:38.712 EAL: Calling mem event callback 'spdk:(nil)' 00:06:38.712 EAL: request: mp_malloc_sync 00:06:38.713 EAL: No shared files mode enabled, IPC is disabled 00:06:38.713 EAL: Heap on socket 0 was expanded by 6MB 00:06:38.713 EAL: Calling mem event callback 'spdk:(nil)' 00:06:38.713 EAL: request: mp_malloc_sync 00:06:38.713 EAL: No shared files mode enabled, IPC is disabled 00:06:38.713 EAL: Heap on socket 0 was shrunk by 6MB 00:06:38.713 EAL: Trying to obtain current memory policy. 
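The expand/shrink pairs above are driven by DPDK's memory-event mechanism: SPDK registers a callback (the "Mem event callback 'spdk:(nil)' registered" line), and every time the malloc heap maps or releases hugepages the callback fires so translation maps can be kept in sync. A minimal sketch of the same hook, assuming an initialized DPDK 22.11 environment; the callback name "example-cb" and the handler body are illustrative, not SPDK's:

#include <stdio.h>
#include <rte_memory.h>

/* Fires on every hugepage allocation/free in the process. */
static void
mem_event_cb(enum rte_mem_event type, const void *addr, size_t len, void *arg)
{
    (void)arg;
    printf("%s addr=%p len=%zu\n",
           type == RTE_MEM_EVENT_ALLOC ? "expanded" : "shrunk", addr, len);
}

/* after rte_eal_init(): */
/* rte_mem_event_callback_register("example-cb", mem_event_cb, NULL); */

With such a callback registered, each "Heap on socket 0 was expanded/shrunk by N MB" line above would be mirrored by one callback invocation.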
00:06:38.713 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:38.713 EAL: Restoring previous memory policy: 4 00:06:38.713 EAL: Calling mem event callback 'spdk:(nil)' 00:06:38.713 EAL: request: mp_malloc_sync 00:06:38.713 EAL: No shared files mode enabled, IPC is disabled 00:06:38.713 EAL: Heap on socket 0 was expanded by 10MB 00:06:38.713 EAL: Calling mem event callback 'spdk:(nil)' 00:06:38.713 EAL: request: mp_malloc_sync 00:06:38.713 EAL: No shared files mode enabled, IPC is disabled 00:06:38.713 EAL: Heap on socket 0 was shrunk by 10MB 00:06:38.713 EAL: Trying to obtain current memory policy. 00:06:38.713 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:38.713 EAL: Restoring previous memory policy: 4 00:06:38.713 EAL: Calling mem event callback 'spdk:(nil)' 00:06:38.713 EAL: request: mp_malloc_sync 00:06:38.713 EAL: No shared files mode enabled, IPC is disabled 00:06:38.713 EAL: Heap on socket 0 was expanded by 18MB 00:06:38.713 EAL: Calling mem event callback 'spdk:(nil)' 00:06:38.713 EAL: request: mp_malloc_sync 00:06:38.713 EAL: No shared files mode enabled, IPC is disabled 00:06:38.713 EAL: Heap on socket 0 was shrunk by 18MB 00:06:38.713 EAL: Trying to obtain current memory policy. 00:06:38.713 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:38.713 EAL: Restoring previous memory policy: 4 00:06:38.713 EAL: Calling mem event callback 'spdk:(nil)' 00:06:38.713 EAL: request: mp_malloc_sync 00:06:38.713 EAL: No shared files mode enabled, IPC is disabled 00:06:38.713 EAL: Heap on socket 0 was expanded by 34MB 00:06:38.713 EAL: Calling mem event callback 'spdk:(nil)' 00:06:38.713 EAL: request: mp_malloc_sync 00:06:38.713 EAL: No shared files mode enabled, IPC is disabled 00:06:38.713 EAL: Heap on socket 0 was shrunk by 34MB 00:06:38.713 EAL: Trying to obtain current memory policy. 00:06:38.713 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:38.713 EAL: Restoring previous memory policy: 4 00:06:38.713 EAL: Calling mem event callback 'spdk:(nil)' 00:06:38.713 EAL: request: mp_malloc_sync 00:06:38.713 EAL: No shared files mode enabled, IPC is disabled 00:06:38.713 EAL: Heap on socket 0 was expanded by 66MB 00:06:38.713 EAL: Calling mem event callback 'spdk:(nil)' 00:06:38.713 EAL: request: mp_malloc_sync 00:06:38.713 EAL: No shared files mode enabled, IPC is disabled 00:06:38.713 EAL: Heap on socket 0 was shrunk by 66MB 00:06:38.713 EAL: Trying to obtain current memory policy. 00:06:38.713 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:38.713 EAL: Restoring previous memory policy: 4 00:06:38.713 EAL: Calling mem event callback 'spdk:(nil)' 00:06:38.713 EAL: request: mp_malloc_sync 00:06:38.713 EAL: No shared files mode enabled, IPC is disabled 00:06:38.713 EAL: Heap on socket 0 was expanded by 130MB 00:06:38.713 EAL: Calling mem event callback 'spdk:(nil)' 00:06:38.713 EAL: request: mp_malloc_sync 00:06:38.713 EAL: No shared files mode enabled, IPC is disabled 00:06:38.713 EAL: Heap on socket 0 was shrunk by 130MB 00:06:38.713 EAL: Trying to obtain current memory policy. 
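What these malloc round-trips ultimately validate is the virtual-to-physical translation SPDK maintains for DMA: any buffer handed to a device must resolve through spdk_vtophys(). A hedged sketch of that path, assuming an initialized SPDK environment (spdk_env_init()); the buffer size and alignment are arbitrary:

#include <inttypes.h>
#include <stdio.h>
#include "spdk/env.h"

static void
translate_demo(void)
{
    /* DMA-safe, hugepage-backed allocation; NULL skips the legacy
     * phys_addr out-parameter so we translate explicitly below. */
    void *buf = spdk_dma_malloc(4096, 0x1000, NULL);
    if (buf == NULL)
        return;
    uint64_t paddr = spdk_vtophys(buf, NULL);
    if (paddr != SPDK_VTOPHYS_ERROR)
        printf("va %p -> pa 0x%" PRIx64 "\n", buf, paddr);
    spdk_dma_free(buf);
}

Under the IOVA-as-VA mode selected earlier in this log, the address returned is really the IOVA, which equals the virtual address.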
00:06:38.713 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:38.713 EAL: Restoring previous memory policy: 4 00:06:38.713 EAL: Calling mem event callback 'spdk:(nil)' 00:06:38.713 EAL: request: mp_malloc_sync 00:06:38.713 EAL: No shared files mode enabled, IPC is disabled 00:06:38.713 EAL: Heap on socket 0 was expanded by 258MB 00:06:38.713 EAL: Calling mem event callback 'spdk:(nil)' 00:06:38.971 EAL: request: mp_malloc_sync 00:06:38.971 EAL: No shared files mode enabled, IPC is disabled 00:06:38.971 EAL: Heap on socket 0 was shrunk by 258MB 00:06:38.971 EAL: Trying to obtain current memory policy. 00:06:38.971 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:38.971 EAL: Restoring previous memory policy: 4 00:06:38.971 EAL: Calling mem event callback 'spdk:(nil)' 00:06:38.971 EAL: request: mp_malloc_sync 00:06:38.971 EAL: No shared files mode enabled, IPC is disabled 00:06:38.971 EAL: Heap on socket 0 was expanded by 514MB 00:06:38.971 EAL: Calling mem event callback 'spdk:(nil)' 00:06:38.971 EAL: request: mp_malloc_sync 00:06:38.971 EAL: No shared files mode enabled, IPC is disabled 00:06:38.971 EAL: Heap on socket 0 was shrunk by 514MB 00:06:38.971 EAL: Trying to obtain current memory policy. 00:06:38.971 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:39.229 EAL: Restoring previous memory policy: 4 00:06:39.229 EAL: Calling mem event callback 'spdk:(nil)' 00:06:39.229 EAL: request: mp_malloc_sync 00:06:39.229 EAL: No shared files mode enabled, IPC is disabled 00:06:39.229 EAL: Heap on socket 0 was expanded by 1026MB 00:06:39.488 EAL: Calling mem event callback 'spdk:(nil)' 00:06:39.488 EAL: request: mp_malloc_sync 00:06:39.488 EAL: No shared files mode enabled, IPC is disabled 00:06:39.488 EAL: Heap on socket 0 was shrunk by 1026MB 00:06:39.488 passed 00:06:39.488 00:06:39.488 Run Summary: Type Total Ran Passed Failed Inactive 00:06:39.488 suites 1 1 n/a 0 0 00:06:39.488 tests 2 2 2 0 0 00:06:39.488 asserts 497 497 497 0 n/a 00:06:39.488 00:06:39.488 Elapsed time = 0.959 seconds 00:06:39.488 EAL: Calling mem event callback 'spdk:(nil)' 00:06:39.488 EAL: request: mp_malloc_sync 00:06:39.488 EAL: No shared files mode enabled, IPC is disabled 00:06:39.488 EAL: Heap on socket 0 was shrunk by 2MB 00:06:39.488 EAL: No shared files mode enabled, IPC is disabled 00:06:39.488 EAL: No shared files mode enabled, IPC is disabled 00:06:39.488 EAL: No shared files mode enabled, IPC is disabled 00:06:39.488 00:06:39.488 real 0m1.077s 00:06:39.488 user 0m0.625s 00:06:39.488 sys 0m0.424s 00:06:39.488 08:24:32 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:39.488 08:24:32 -- common/autotest_common.sh@10 -- # set +x 00:06:39.488 ************************************ 00:06:39.488 END TEST env_vtophys 00:06:39.488 ************************************ 00:06:39.746 08:24:32 -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:06:39.746 08:24:32 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:39.746 08:24:32 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:39.746 08:24:32 -- common/autotest_common.sh@10 -- # set +x 00:06:39.746 ************************************ 00:06:39.746 START TEST env_pci 00:06:39.746 ************************************ 00:06:39.746 08:24:32 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:06:39.746 00:06:39.746 00:06:39.746 CUnit - A unit testing framework for C - Version 2.1-3 00:06:39.746 
http://cunit.sourceforge.net/ 00:06:39.746 00:06:39.746 00:06:39.746 Suite: pci 00:06:39.746 Test: pci_hook ...[2024-10-04 08:24:32.228269] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1041:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 989319 has claimed it 00:06:39.746 EAL: Cannot find device (10000:00:01.0) 00:06:39.746 EAL: Failed to attach device on primary process 00:06:39.746 passed 00:06:39.746 00:06:39.746 Run Summary: Type Total Ran Passed Failed Inactive 00:06:39.746 suites 1 1 n/a 0 0 00:06:39.746 tests 1 1 1 0 0 00:06:39.746 asserts 25 25 25 0 n/a 00:06:39.746 00:06:39.746 Elapsed time = 0.036 seconds 00:06:39.746 00:06:39.746 real 0m0.055s 00:06:39.746 user 0m0.020s 00:06:39.746 sys 0m0.035s 00:06:39.746 08:24:32 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:39.746 08:24:32 -- common/autotest_common.sh@10 -- # set +x 00:06:39.746 ************************************ 00:06:39.746 END TEST env_pci 00:06:39.746 ************************************ 00:06:39.746 08:24:32 -- env/env.sh@14 -- # argv='-c 0x1 ' 00:06:39.746 08:24:32 -- env/env.sh@15 -- # uname 00:06:39.746 08:24:32 -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:06:39.746 08:24:32 -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:06:39.746 08:24:32 -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:06:39.746 08:24:32 -- common/autotest_common.sh@1077 -- # '[' 5 -le 1 ']' 00:06:39.746 08:24:32 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:39.746 08:24:32 -- common/autotest_common.sh@10 -- # set +x 00:06:39.746 ************************************ 00:06:39.746 START TEST env_dpdk_post_init 00:06:39.746 ************************************ 00:06:39.746 08:24:32 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:06:39.746 EAL: Detected CPU lcores: 112 00:06:39.746 EAL: Detected NUMA nodes: 2 00:06:39.746 EAL: Detected static linkage of DPDK 00:06:39.746 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:39.746 EAL: Selected IOVA mode 'VA' 00:06:39.746 EAL: No free 2048 kB hugepages reported on node 1 00:06:39.746 EAL: VFIO support initialized 00:06:39.746 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:40.005 EAL: Using IOMMU type 1 (Type 1) 00:06:40.571 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:d8:00.0 (socket 1) 00:06:44.753 EAL: Releasing PCI mapped resource for 0000:d8:00.0 00:06:44.753 EAL: Calling pci_unmap_resource for 0000:d8:00.0 at 0x202001000000 00:06:44.753 Starting DPDK initialization... 00:06:44.753 Starting SPDK post initialization... 00:06:44.753 SPDK NVMe probe 00:06:44.753 Attaching to 0000:d8:00.0 00:06:44.753 Attached to 0000:d8:00.0 00:06:44.753 Cleaning up... 
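The "Starting SPDK post initialization... Attached to 0000:d8:00.0" sequence above is the standard probe/attach flow: enumerate NVMe controllers, decide per controller whether to attach, and get a callback once each controller is initialized. A minimal sketch, assuming an initialized SPDK env; the callback bodies are illustrative rather than the test's actual code:

#include <stdbool.h>
#include <stdio.h>
#include "spdk/nvme.h"

static bool
probe_cb(void *cb_ctx, const struct spdk_nvme_transport_id *trid,
         struct spdk_nvme_ctrlr_opts *opts)
{
    printf("probing %s\n", trid->traddr);
    return true; /* true = attach to this controller */
}

static void
attach_cb(void *cb_ctx, const struct spdk_nvme_transport_id *trid,
          struct spdk_nvme_ctrlr *ctrlr, const struct spdk_nvme_ctrlr_opts *opts)
{
    printf("attached to %s\n", trid->traddr); /* e.g. 0000:d8:00.0 */
}

/* A NULL trid probes all local PCIe NVMe devices: */
/* spdk_nvme_probe(NULL, NULL, probe_cb, attach_cb, NULL); */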
00:06:44.753 00:06:44.753 real 0m4.660s 00:06:44.753 user 0m3.526s 00:06:44.753 sys 0m0.381s 00:06:44.753 08:24:36 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:44.753 08:24:36 -- common/autotest_common.sh@10 -- # set +x 00:06:44.753 ************************************ 00:06:44.753 END TEST env_dpdk_post_init 00:06:44.753 ************************************ 00:06:44.753 08:24:37 -- env/env.sh@26 -- # uname 00:06:44.753 08:24:37 -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:06:44.753 08:24:37 -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:44.753 08:24:37 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:44.753 08:24:37 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:44.753 08:24:37 -- common/autotest_common.sh@10 -- # set +x 00:06:44.753 ************************************ 00:06:44.753 START TEST env_mem_callbacks 00:06:44.753 ************************************ 00:06:44.753 08:24:37 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:44.753 EAL: Detected CPU lcores: 112 00:06:44.753 EAL: Detected NUMA nodes: 2 00:06:44.753 EAL: Detected static linkage of DPDK 00:06:44.753 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:44.753 EAL: Selected IOVA mode 'VA' 00:06:44.753 EAL: No free 2048 kB hugepages reported on node 1 00:06:44.753 EAL: VFIO support initialized 00:06:44.753 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:44.753 00:06:44.753 00:06:44.753 CUnit - A unit testing framework for C - Version 2.1-3 00:06:44.753 http://cunit.sourceforge.net/ 00:06:44.753 00:06:44.753 00:06:44.753 Suite: memory 00:06:44.753 Test: test ... 
00:06:44.753 register 0x200000200000 2097152 00:06:44.753 malloc 3145728 00:06:44.753 register 0x200000400000 4194304 00:06:44.753 buf 0x200000500000 len 3145728 PASSED 00:06:44.753 malloc 64 00:06:44.753 buf 0x2000004fff40 len 64 PASSED 00:06:44.753 malloc 4194304 00:06:44.753 register 0x200000800000 6291456 00:06:44.753 buf 0x200000a00000 len 4194304 PASSED 00:06:44.753 free 0x200000500000 3145728 00:06:44.753 free 0x2000004fff40 64 00:06:44.753 unregister 0x200000400000 4194304 PASSED 00:06:44.753 free 0x200000a00000 4194304 00:06:44.753 unregister 0x200000800000 6291456 PASSED 00:06:44.753 malloc 8388608 00:06:44.753 register 0x200000400000 10485760 00:06:44.753 buf 0x200000600000 len 8388608 PASSED 00:06:44.753 free 0x200000600000 8388608 00:06:44.753 unregister 0x200000400000 10485760 PASSED 00:06:44.753 passed 00:06:44.753 00:06:44.753 Run Summary: Type Total Ran Passed Failed Inactive 00:06:44.753 suites 1 1 n/a 0 0 00:06:44.753 tests 1 1 1 0 0 00:06:44.753 asserts 15 15 15 0 n/a 00:06:44.753 00:06:44.753 Elapsed time = 0.005 seconds 00:06:44.753 00:06:44.753 real 0m0.065s 00:06:44.753 user 0m0.019s 00:06:44.753 sys 0m0.046s 00:06:44.753 08:24:37 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:44.753 08:24:37 -- common/autotest_common.sh@10 -- # set +x 00:06:44.754 ************************************ 00:06:44.754 END TEST env_mem_callbacks 00:06:44.754 ************************************ 00:06:44.754 00:06:44.754 real 0m6.304s 00:06:44.754 user 0m4.403s 00:06:44.754 sys 0m1.169s 00:06:44.754 08:24:37 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:44.754 08:24:37 -- common/autotest_common.sh@10 -- # set +x 00:06:44.754 ************************************ 00:06:44.754 END TEST env 00:06:44.754 ************************************ 00:06:44.754 08:24:37 -- spdk/autotest.sh@176 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:06:44.754 08:24:37 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:44.754 08:24:37 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:44.754 08:24:37 -- common/autotest_common.sh@10 -- # set +x 00:06:44.754 ************************************ 00:06:44.754 START TEST rpc 00:06:44.754 ************************************ 00:06:44.754 08:24:37 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:06:44.754 * Looking for test storage... 00:06:44.754 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:44.754 08:24:37 -- rpc/rpc.sh@65 -- # spdk_pid=990359 00:06:44.754 08:24:37 -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:44.754 08:24:37 -- rpc/rpc.sh@67 -- # waitforlisten 990359 00:06:44.754 08:24:37 -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:06:44.754 08:24:37 -- common/autotest_common.sh@819 -- # '[' -z 990359 ']' 00:06:44.754 08:24:37 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:44.754 08:24:37 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:44.754 08:24:37 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:44.754 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
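The register/unregister trace above pairs each heap change with an spdk_mem_register()/spdk_mem_unregister() call, which is what keeps the mem-map translations (and any registered notify callbacks) consistent. A hedged sketch of one such pair, assuming an initialized SPDK env; the region here is illustrative, and real registrations must cover pinned, 2 MB-aligned ranges:

#include "spdk/env.h"

static int
register_once(void *vaddr)
{
    size_t len = 2 * 1024 * 1024; /* one 2 MB hugepage, as in the trace */

    int rc = spdk_mem_register(vaddr, len); /* notifies all mem maps */
    if (rc != 0)
        return rc;
    return spdk_mem_unregister(vaddr, len); /* mirrors the 'unregister' lines */
}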
00:06:44.754 08:24:37 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:44.754 08:24:37 -- common/autotest_common.sh@10 -- # set +x 00:06:44.754 [2024-10-04 08:24:37.295087] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:06:44.754 [2024-10-04 08:24:37.295138] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid990359 ] 00:06:44.754 EAL: No free 2048 kB hugepages reported on node 1 00:06:44.754 [2024-10-04 08:24:37.359887] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:44.754 [2024-10-04 08:24:37.399385] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:44.754 [2024-10-04 08:24:37.399484] app.c: 488:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:06:44.754 [2024-10-04 08:24:37.399495] app.c: 492:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 990359' to capture a snapshot of events at runtime. 00:06:44.754 [2024-10-04 08:24:37.399504] app.c: 494:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid990359 for offline analysis/debug. 00:06:44.754 [2024-10-04 08:24:37.399527] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:45.689 08:24:38 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:45.689 08:24:38 -- common/autotest_common.sh@852 -- # return 0 00:06:45.690 08:24:38 -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:45.690 08:24:38 -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:45.690 08:24:38 -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:06:45.690 08:24:38 -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:06:45.690 08:24:38 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:45.690 08:24:38 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:45.690 08:24:38 -- common/autotest_common.sh@10 -- # set +x 00:06:45.690 ************************************ 00:06:45.690 START TEST rpc_integrity 00:06:45.690 ************************************ 00:06:45.690 08:24:38 -- common/autotest_common.sh@1104 -- # rpc_integrity 00:06:45.690 08:24:38 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:45.690 08:24:38 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:45.690 08:24:38 -- common/autotest_common.sh@10 -- # set +x 00:06:45.690 08:24:38 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:45.690 08:24:38 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:45.690 08:24:38 -- rpc/rpc.sh@13 -- # jq length 00:06:45.690 08:24:38 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:45.690 08:24:38 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:45.690 08:24:38 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:45.690 08:24:38 -- common/autotest_common.sh@10 -- # set +x 00:06:45.690 08:24:38 -- 
common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:45.690 08:24:38 -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:06:45.690 08:24:38 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:45.690 08:24:38 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:45.690 08:24:38 -- common/autotest_common.sh@10 -- # set +x 00:06:45.690 08:24:38 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:45.690 08:24:38 -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:45.690 { 00:06:45.690 "name": "Malloc0", 00:06:45.690 "aliases": [ 00:06:45.690 "66c643db-3b61-427d-8ed7-f761181f03e6" 00:06:45.690 ], 00:06:45.690 "product_name": "Malloc disk", 00:06:45.690 "block_size": 512, 00:06:45.690 "num_blocks": 16384, 00:06:45.690 "uuid": "66c643db-3b61-427d-8ed7-f761181f03e6", 00:06:45.690 "assigned_rate_limits": { 00:06:45.690 "rw_ios_per_sec": 0, 00:06:45.690 "rw_mbytes_per_sec": 0, 00:06:45.690 "r_mbytes_per_sec": 0, 00:06:45.690 "w_mbytes_per_sec": 0 00:06:45.690 }, 00:06:45.690 "claimed": false, 00:06:45.690 "zoned": false, 00:06:45.690 "supported_io_types": { 00:06:45.690 "read": true, 00:06:45.690 "write": true, 00:06:45.690 "unmap": true, 00:06:45.690 "write_zeroes": true, 00:06:45.690 "flush": true, 00:06:45.690 "reset": true, 00:06:45.690 "compare": false, 00:06:45.690 "compare_and_write": false, 00:06:45.690 "abort": true, 00:06:45.690 "nvme_admin": false, 00:06:45.690 "nvme_io": false 00:06:45.690 }, 00:06:45.690 "memory_domains": [ 00:06:45.690 { 00:06:45.690 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:45.690 "dma_device_type": 2 00:06:45.690 } 00:06:45.690 ], 00:06:45.690 "driver_specific": {} 00:06:45.690 } 00:06:45.690 ]' 00:06:45.690 08:24:38 -- rpc/rpc.sh@17 -- # jq length 00:06:45.690 08:24:38 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:45.690 08:24:38 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:06:45.690 08:24:38 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:45.690 08:24:38 -- common/autotest_common.sh@10 -- # set +x 00:06:45.690 [2024-10-04 08:24:38.255058] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:06:45.690 [2024-10-04 08:24:38.255093] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:45.690 [2024-10-04 08:24:38.255115] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x56c3850 00:06:45.690 [2024-10-04 08:24:38.255125] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:45.690 [2024-10-04 08:24:38.255939] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:45.690 [2024-10-04 08:24:38.255963] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:45.690 Passthru0 00:06:45.690 08:24:38 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:45.690 08:24:38 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:45.690 08:24:38 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:45.690 08:24:38 -- common/autotest_common.sh@10 -- # set +x 00:06:45.690 08:24:38 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:45.690 08:24:38 -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:45.690 { 00:06:45.690 "name": "Malloc0", 00:06:45.690 "aliases": [ 00:06:45.690 "66c643db-3b61-427d-8ed7-f761181f03e6" 00:06:45.690 ], 00:06:45.690 "product_name": "Malloc disk", 00:06:45.690 "block_size": 512, 00:06:45.690 "num_blocks": 16384, 00:06:45.690 "uuid": "66c643db-3b61-427d-8ed7-f761181f03e6", 00:06:45.690 "assigned_rate_limits": { 00:06:45.690 "rw_ios_per_sec": 0, 00:06:45.690 
"rw_mbytes_per_sec": 0, 00:06:45.690 "r_mbytes_per_sec": 0, 00:06:45.690 "w_mbytes_per_sec": 0 00:06:45.690 }, 00:06:45.690 "claimed": true, 00:06:45.690 "claim_type": "exclusive_write", 00:06:45.690 "zoned": false, 00:06:45.690 "supported_io_types": { 00:06:45.690 "read": true, 00:06:45.690 "write": true, 00:06:45.690 "unmap": true, 00:06:45.690 "write_zeroes": true, 00:06:45.690 "flush": true, 00:06:45.690 "reset": true, 00:06:45.690 "compare": false, 00:06:45.690 "compare_and_write": false, 00:06:45.690 "abort": true, 00:06:45.690 "nvme_admin": false, 00:06:45.690 "nvme_io": false 00:06:45.690 }, 00:06:45.690 "memory_domains": [ 00:06:45.690 { 00:06:45.690 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:45.690 "dma_device_type": 2 00:06:45.690 } 00:06:45.690 ], 00:06:45.690 "driver_specific": {} 00:06:45.690 }, 00:06:45.690 { 00:06:45.690 "name": "Passthru0", 00:06:45.690 "aliases": [ 00:06:45.690 "b3d592f6-ff77-54ea-9efd-e56f31370f1e" 00:06:45.690 ], 00:06:45.690 "product_name": "passthru", 00:06:45.690 "block_size": 512, 00:06:45.690 "num_blocks": 16384, 00:06:45.690 "uuid": "b3d592f6-ff77-54ea-9efd-e56f31370f1e", 00:06:45.690 "assigned_rate_limits": { 00:06:45.690 "rw_ios_per_sec": 0, 00:06:45.690 "rw_mbytes_per_sec": 0, 00:06:45.690 "r_mbytes_per_sec": 0, 00:06:45.690 "w_mbytes_per_sec": 0 00:06:45.690 }, 00:06:45.690 "claimed": false, 00:06:45.690 "zoned": false, 00:06:45.690 "supported_io_types": { 00:06:45.690 "read": true, 00:06:45.690 "write": true, 00:06:45.690 "unmap": true, 00:06:45.690 "write_zeroes": true, 00:06:45.690 "flush": true, 00:06:45.690 "reset": true, 00:06:45.690 "compare": false, 00:06:45.690 "compare_and_write": false, 00:06:45.690 "abort": true, 00:06:45.690 "nvme_admin": false, 00:06:45.690 "nvme_io": false 00:06:45.690 }, 00:06:45.690 "memory_domains": [ 00:06:45.690 { 00:06:45.691 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:45.691 "dma_device_type": 2 00:06:45.691 } 00:06:45.691 ], 00:06:45.691 "driver_specific": { 00:06:45.691 "passthru": { 00:06:45.691 "name": "Passthru0", 00:06:45.691 "base_bdev_name": "Malloc0" 00:06:45.691 } 00:06:45.691 } 00:06:45.691 } 00:06:45.691 ]' 00:06:45.691 08:24:38 -- rpc/rpc.sh@21 -- # jq length 00:06:45.691 08:24:38 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:45.691 08:24:38 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:45.691 08:24:38 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:45.691 08:24:38 -- common/autotest_common.sh@10 -- # set +x 00:06:45.691 08:24:38 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:45.691 08:24:38 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:06:45.691 08:24:38 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:45.691 08:24:38 -- common/autotest_common.sh@10 -- # set +x 00:06:45.691 08:24:38 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:45.691 08:24:38 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:45.691 08:24:38 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:45.691 08:24:38 -- common/autotest_common.sh@10 -- # set +x 00:06:45.691 08:24:38 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:45.691 08:24:38 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:45.691 08:24:38 -- rpc/rpc.sh@26 -- # jq length 00:06:45.949 08:24:38 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:45.949 00:06:45.949 real 0m0.257s 00:06:45.949 user 0m0.153s 00:06:45.949 sys 0m0.040s 00:06:45.949 08:24:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:45.949 08:24:38 -- common/autotest_common.sh@10 -- # set +x 00:06:45.949 
************************************ 00:06:45.949 END TEST rpc_integrity 00:06:45.949 ************************************ 00:06:45.949 08:24:38 -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:06:45.949 08:24:38 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:45.949 08:24:38 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:45.949 08:24:38 -- common/autotest_common.sh@10 -- # set +x 00:06:45.949 ************************************ 00:06:45.949 START TEST rpc_plugins 00:06:45.949 ************************************ 00:06:45.949 08:24:38 -- common/autotest_common.sh@1104 -- # rpc_plugins 00:06:45.949 08:24:38 -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:06:45.949 08:24:38 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:45.949 08:24:38 -- common/autotest_common.sh@10 -- # set +x 00:06:45.949 08:24:38 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:45.949 08:24:38 -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:06:45.949 08:24:38 -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:06:45.949 08:24:38 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:45.949 08:24:38 -- common/autotest_common.sh@10 -- # set +x 00:06:45.949 08:24:38 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:45.949 08:24:38 -- rpc/rpc.sh@31 -- # bdevs='[ 00:06:45.949 { 00:06:45.949 "name": "Malloc1", 00:06:45.949 "aliases": [ 00:06:45.949 "21c1f66d-0823-49c0-8113-3b4d3d75bab1" 00:06:45.949 ], 00:06:45.949 "product_name": "Malloc disk", 00:06:45.949 "block_size": 4096, 00:06:45.949 "num_blocks": 256, 00:06:45.949 "uuid": "21c1f66d-0823-49c0-8113-3b4d3d75bab1", 00:06:45.949 "assigned_rate_limits": { 00:06:45.949 "rw_ios_per_sec": 0, 00:06:45.949 "rw_mbytes_per_sec": 0, 00:06:45.949 "r_mbytes_per_sec": 0, 00:06:45.949 "w_mbytes_per_sec": 0 00:06:45.949 }, 00:06:45.949 "claimed": false, 00:06:45.949 "zoned": false, 00:06:45.949 "supported_io_types": { 00:06:45.949 "read": true, 00:06:45.949 "write": true, 00:06:45.949 "unmap": true, 00:06:45.949 "write_zeroes": true, 00:06:45.949 "flush": true, 00:06:45.949 "reset": true, 00:06:45.949 "compare": false, 00:06:45.949 "compare_and_write": false, 00:06:45.949 "abort": true, 00:06:45.949 "nvme_admin": false, 00:06:45.949 "nvme_io": false 00:06:45.949 }, 00:06:45.949 "memory_domains": [ 00:06:45.949 { 00:06:45.949 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:45.949 "dma_device_type": 2 00:06:45.949 } 00:06:45.949 ], 00:06:45.949 "driver_specific": {} 00:06:45.949 } 00:06:45.949 ]' 00:06:45.949 08:24:38 -- rpc/rpc.sh@32 -- # jq length 00:06:45.949 08:24:38 -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:06:45.949 08:24:38 -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:06:45.949 08:24:38 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:45.949 08:24:38 -- common/autotest_common.sh@10 -- # set +x 00:06:45.949 08:24:38 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:45.949 08:24:38 -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:06:45.949 08:24:38 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:45.949 08:24:38 -- common/autotest_common.sh@10 -- # set +x 00:06:45.949 08:24:38 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:45.949 08:24:38 -- rpc/rpc.sh@35 -- # bdevs='[]' 00:06:45.949 08:24:38 -- rpc/rpc.sh@36 -- # jq length 00:06:45.949 08:24:38 -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:06:45.949 00:06:45.949 real 0m0.141s 00:06:45.949 user 0m0.089s 00:06:45.949 sys 0m0.019s 00:06:45.949 08:24:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 
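The bdev descriptor dumped by bdev_get_bdevs above (name, aliases, block_size, supported_io_types, ...) is serialized through SPDK's JSON write API. A standalone sketch of how such an object is emitted, assuming spdk/json.h; the write callback and the subset of fields shown are illustrative, not the bdev layer's actual code:

#include <stdio.h>
#include "spdk/json.h"

/* Sink for serialized bytes; a real RPC server writes to the socket. */
static int
write_cb(void *cb_ctx, const void *data, size_t size)
{
    fwrite(data, 1, size, stdout);
    return 0;
}

static void
dump_bdev_like_object(void)
{
    struct spdk_json_write_ctx *w =
        spdk_json_write_begin(write_cb, NULL, SPDK_JSON_WRITE_FLAG_FORMATTED);
    spdk_json_write_object_begin(w);
    spdk_json_write_named_string(w, "name", "Malloc1");
    spdk_json_write_named_uint32(w, "block_size", 4096);
    spdk_json_write_named_uint64(w, "num_blocks", 256);
    spdk_json_write_object_end(w);
    spdk_json_write_end(w);
}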
00:06:45.949 08:24:38 -- common/autotest_common.sh@10 -- # set +x 00:06:45.949 ************************************ 00:06:45.949 END TEST rpc_plugins 00:06:45.949 ************************************ 00:06:45.949 08:24:38 -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:06:45.949 08:24:38 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:45.949 08:24:38 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:45.949 08:24:38 -- common/autotest_common.sh@10 -- # set +x 00:06:45.949 ************************************ 00:06:45.949 START TEST rpc_trace_cmd_test 00:06:45.949 ************************************ 00:06:45.949 08:24:38 -- common/autotest_common.sh@1104 -- # rpc_trace_cmd_test 00:06:45.949 08:24:38 -- rpc/rpc.sh@40 -- # local info 00:06:45.949 08:24:38 -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:06:45.949 08:24:38 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:45.949 08:24:38 -- common/autotest_common.sh@10 -- # set +x 00:06:46.207 08:24:38 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:46.207 08:24:38 -- rpc/rpc.sh@42 -- # info='{ 00:06:46.207 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid990359", 00:06:46.207 "tpoint_group_mask": "0x8", 00:06:46.207 "iscsi_conn": { 00:06:46.207 "mask": "0x2", 00:06:46.207 "tpoint_mask": "0x0" 00:06:46.207 }, 00:06:46.207 "scsi": { 00:06:46.207 "mask": "0x4", 00:06:46.207 "tpoint_mask": "0x0" 00:06:46.207 }, 00:06:46.207 "bdev": { 00:06:46.207 "mask": "0x8", 00:06:46.207 "tpoint_mask": "0xffffffffffffffff" 00:06:46.207 }, 00:06:46.207 "nvmf_rdma": { 00:06:46.207 "mask": "0x10", 00:06:46.207 "tpoint_mask": "0x0" 00:06:46.207 }, 00:06:46.207 "nvmf_tcp": { 00:06:46.207 "mask": "0x20", 00:06:46.208 "tpoint_mask": "0x0" 00:06:46.208 }, 00:06:46.208 "ftl": { 00:06:46.208 "mask": "0x40", 00:06:46.208 "tpoint_mask": "0x0" 00:06:46.208 }, 00:06:46.208 "blobfs": { 00:06:46.208 "mask": "0x80", 00:06:46.208 "tpoint_mask": "0x0" 00:06:46.208 }, 00:06:46.208 "dsa": { 00:06:46.208 "mask": "0x200", 00:06:46.208 "tpoint_mask": "0x0" 00:06:46.208 }, 00:06:46.208 "thread": { 00:06:46.208 "mask": "0x400", 00:06:46.208 "tpoint_mask": "0x0" 00:06:46.208 }, 00:06:46.208 "nvme_pcie": { 00:06:46.208 "mask": "0x800", 00:06:46.208 "tpoint_mask": "0x0" 00:06:46.208 }, 00:06:46.208 "iaa": { 00:06:46.208 "mask": "0x1000", 00:06:46.208 "tpoint_mask": "0x0" 00:06:46.208 }, 00:06:46.208 "nvme_tcp": { 00:06:46.208 "mask": "0x2000", 00:06:46.208 "tpoint_mask": "0x0" 00:06:46.208 }, 00:06:46.208 "bdev_nvme": { 00:06:46.208 "mask": "0x4000", 00:06:46.208 "tpoint_mask": "0x0" 00:06:46.208 } 00:06:46.208 }' 00:06:46.208 08:24:38 -- rpc/rpc.sh@43 -- # jq length 00:06:46.208 08:24:38 -- rpc/rpc.sh@43 -- # '[' 15 -gt 2 ']' 00:06:46.208 08:24:38 -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:06:46.208 08:24:38 -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:06:46.208 08:24:38 -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:06:46.208 08:24:38 -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:06:46.208 08:24:38 -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:06:46.208 08:24:38 -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:06:46.208 08:24:38 -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:06:46.208 08:24:38 -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:06:46.208 00:06:46.208 real 0m0.223s 00:06:46.208 user 0m0.180s 00:06:46.208 sys 0m0.033s 00:06:46.208 08:24:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:46.208 08:24:38 -- common/autotest_common.sh@10 -- # set +x 00:06:46.208 
************************************ 00:06:46.208 END TEST rpc_trace_cmd_test 00:06:46.208 ************************************ 00:06:46.208 08:24:38 -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:06:46.208 08:24:38 -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:06:46.208 08:24:38 -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:06:46.208 08:24:38 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:46.208 08:24:38 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:46.208 08:24:38 -- common/autotest_common.sh@10 -- # set +x 00:06:46.208 ************************************ 00:06:46.208 START TEST rpc_daemon_integrity 00:06:46.208 ************************************ 00:06:46.208 08:24:38 -- common/autotest_common.sh@1104 -- # rpc_integrity 00:06:46.208 08:24:38 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:46.208 08:24:38 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:46.208 08:24:38 -- common/autotest_common.sh@10 -- # set +x 00:06:46.466 08:24:38 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:46.466 08:24:38 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:46.466 08:24:38 -- rpc/rpc.sh@13 -- # jq length 00:06:46.466 08:24:38 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:46.466 08:24:38 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:46.466 08:24:38 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:46.466 08:24:38 -- common/autotest_common.sh@10 -- # set +x 00:06:46.466 08:24:38 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:46.466 08:24:38 -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:06:46.466 08:24:38 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:46.466 08:24:38 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:46.466 08:24:38 -- common/autotest_common.sh@10 -- # set +x 00:06:46.466 08:24:38 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:46.466 08:24:38 -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:46.466 { 00:06:46.466 "name": "Malloc2", 00:06:46.466 "aliases": [ 00:06:46.466 "108c150d-37f7-4d49-9741-caafb4a6ce60" 00:06:46.466 ], 00:06:46.466 "product_name": "Malloc disk", 00:06:46.466 "block_size": 512, 00:06:46.466 "num_blocks": 16384, 00:06:46.466 "uuid": "108c150d-37f7-4d49-9741-caafb4a6ce60", 00:06:46.466 "assigned_rate_limits": { 00:06:46.466 "rw_ios_per_sec": 0, 00:06:46.466 "rw_mbytes_per_sec": 0, 00:06:46.466 "r_mbytes_per_sec": 0, 00:06:46.466 "w_mbytes_per_sec": 0 00:06:46.466 }, 00:06:46.466 "claimed": false, 00:06:46.466 "zoned": false, 00:06:46.466 "supported_io_types": { 00:06:46.466 "read": true, 00:06:46.466 "write": true, 00:06:46.466 "unmap": true, 00:06:46.466 "write_zeroes": true, 00:06:46.466 "flush": true, 00:06:46.466 "reset": true, 00:06:46.466 "compare": false, 00:06:46.466 "compare_and_write": false, 00:06:46.466 "abort": true, 00:06:46.466 "nvme_admin": false, 00:06:46.466 "nvme_io": false 00:06:46.466 }, 00:06:46.466 "memory_domains": [ 00:06:46.466 { 00:06:46.466 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:46.466 "dma_device_type": 2 00:06:46.466 } 00:06:46.466 ], 00:06:46.466 "driver_specific": {} 00:06:46.466 } 00:06:46.466 ]' 00:06:46.466 08:24:38 -- rpc/rpc.sh@17 -- # jq length 00:06:46.466 08:24:39 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:46.466 08:24:39 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:06:46.466 08:24:39 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:46.466 08:24:39 -- common/autotest_common.sh@10 -- # set +x 00:06:46.466 [2024-10-04 08:24:39.021076] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on 
Malloc2 00:06:46.466 [2024-10-04 08:24:39.021108] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:46.466 [2024-10-04 08:24:39.021123] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x56c54c0 00:06:46.466 [2024-10-04 08:24:39.021133] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:46.466 [2024-10-04 08:24:39.021814] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:46.466 [2024-10-04 08:24:39.021836] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:46.466 Passthru0 00:06:46.467 08:24:39 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:46.467 08:24:39 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:46.467 08:24:39 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:46.467 08:24:39 -- common/autotest_common.sh@10 -- # set +x 00:06:46.467 08:24:39 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:46.467 08:24:39 -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:46.467 { 00:06:46.467 "name": "Malloc2", 00:06:46.467 "aliases": [ 00:06:46.467 "108c150d-37f7-4d49-9741-caafb4a6ce60" 00:06:46.467 ], 00:06:46.467 "product_name": "Malloc disk", 00:06:46.467 "block_size": 512, 00:06:46.467 "num_blocks": 16384, 00:06:46.467 "uuid": "108c150d-37f7-4d49-9741-caafb4a6ce60", 00:06:46.467 "assigned_rate_limits": { 00:06:46.467 "rw_ios_per_sec": 0, 00:06:46.467 "rw_mbytes_per_sec": 0, 00:06:46.467 "r_mbytes_per_sec": 0, 00:06:46.467 "w_mbytes_per_sec": 0 00:06:46.467 }, 00:06:46.467 "claimed": true, 00:06:46.467 "claim_type": "exclusive_write", 00:06:46.467 "zoned": false, 00:06:46.467 "supported_io_types": { 00:06:46.467 "read": true, 00:06:46.467 "write": true, 00:06:46.467 "unmap": true, 00:06:46.467 "write_zeroes": true, 00:06:46.467 "flush": true, 00:06:46.467 "reset": true, 00:06:46.467 "compare": false, 00:06:46.467 "compare_and_write": false, 00:06:46.467 "abort": true, 00:06:46.467 "nvme_admin": false, 00:06:46.467 "nvme_io": false 00:06:46.467 }, 00:06:46.467 "memory_domains": [ 00:06:46.467 { 00:06:46.467 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:46.467 "dma_device_type": 2 00:06:46.467 } 00:06:46.467 ], 00:06:46.467 "driver_specific": {} 00:06:46.467 }, 00:06:46.467 { 00:06:46.467 "name": "Passthru0", 00:06:46.467 "aliases": [ 00:06:46.467 "6abd396a-5e54-5d0a-898e-d0f7f976dc10" 00:06:46.467 ], 00:06:46.467 "product_name": "passthru", 00:06:46.467 "block_size": 512, 00:06:46.467 "num_blocks": 16384, 00:06:46.467 "uuid": "6abd396a-5e54-5d0a-898e-d0f7f976dc10", 00:06:46.467 "assigned_rate_limits": { 00:06:46.467 "rw_ios_per_sec": 0, 00:06:46.467 "rw_mbytes_per_sec": 0, 00:06:46.467 "r_mbytes_per_sec": 0, 00:06:46.467 "w_mbytes_per_sec": 0 00:06:46.467 }, 00:06:46.467 "claimed": false, 00:06:46.467 "zoned": false, 00:06:46.467 "supported_io_types": { 00:06:46.467 "read": true, 00:06:46.467 "write": true, 00:06:46.467 "unmap": true, 00:06:46.467 "write_zeroes": true, 00:06:46.467 "flush": true, 00:06:46.467 "reset": true, 00:06:46.467 "compare": false, 00:06:46.467 "compare_and_write": false, 00:06:46.467 "abort": true, 00:06:46.467 "nvme_admin": false, 00:06:46.467 "nvme_io": false 00:06:46.467 }, 00:06:46.467 "memory_domains": [ 00:06:46.467 { 00:06:46.467 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:46.467 "dma_device_type": 2 00:06:46.467 } 00:06:46.467 ], 00:06:46.467 "driver_specific": { 00:06:46.467 "passthru": { 00:06:46.467 "name": "Passthru0", 00:06:46.467 "base_bdev_name": "Malloc2" 00:06:46.467 } 
00:06:46.467 } 00:06:46.467 } 00:06:46.467 ]' 00:06:46.467 08:24:39 -- rpc/rpc.sh@21 -- # jq length 00:06:46.467 08:24:39 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:46.467 08:24:39 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:46.467 08:24:39 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:46.467 08:24:39 -- common/autotest_common.sh@10 -- # set +x 00:06:46.467 08:24:39 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:46.467 08:24:39 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:06:46.467 08:24:39 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:46.467 08:24:39 -- common/autotest_common.sh@10 -- # set +x 00:06:46.467 08:24:39 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:46.467 08:24:39 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:46.467 08:24:39 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:46.467 08:24:39 -- common/autotest_common.sh@10 -- # set +x 00:06:46.467 08:24:39 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:46.467 08:24:39 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:46.467 08:24:39 -- rpc/rpc.sh@26 -- # jq length 00:06:46.467 08:24:39 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:46.467 00:06:46.467 real 0m0.264s 00:06:46.467 user 0m0.162s 00:06:46.467 sys 0m0.036s 00:06:46.467 08:24:39 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:46.467 08:24:39 -- common/autotest_common.sh@10 -- # set +x 00:06:46.467 ************************************ 00:06:46.467 END TEST rpc_daemon_integrity 00:06:46.467 ************************************ 00:06:46.725 08:24:39 -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:06:46.725 08:24:39 -- rpc/rpc.sh@84 -- # killprocess 990359 00:06:46.725 08:24:39 -- common/autotest_common.sh@926 -- # '[' -z 990359 ']' 00:06:46.725 08:24:39 -- common/autotest_common.sh@930 -- # kill -0 990359 00:06:46.725 08:24:39 -- common/autotest_common.sh@931 -- # uname 00:06:46.725 08:24:39 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:46.725 08:24:39 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 990359 00:06:46.725 08:24:39 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:46.725 08:24:39 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:46.725 08:24:39 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 990359' 00:06:46.725 killing process with pid 990359 00:06:46.725 08:24:39 -- common/autotest_common.sh@945 -- # kill 990359 00:06:46.725 08:24:39 -- common/autotest_common.sh@950 -- # wait 990359 00:06:46.984 00:06:46.984 real 0m2.359s 00:06:46.984 user 0m2.979s 00:06:46.984 sys 0m0.693s 00:06:46.984 08:24:39 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:46.984 08:24:39 -- common/autotest_common.sh@10 -- # set +x 00:06:46.984 ************************************ 00:06:46.984 END TEST rpc 00:06:46.984 ************************************ 00:06:46.984 08:24:39 -- spdk/autotest.sh@177 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:46.984 08:24:39 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:46.984 08:24:39 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:46.984 08:24:39 -- common/autotest_common.sh@10 -- # set +x 00:06:46.984 ************************************ 00:06:46.984 START TEST rpc_client 00:06:46.984 ************************************ 00:06:46.984 08:24:39 -- common/autotest_common.sh@1104 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:47.242 * Looking for test storage... 00:06:47.242 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:06:47.242 08:24:39 -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:06:47.242 OK 00:06:47.242 08:24:39 -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:06:47.242 00:06:47.242 real 0m0.115s 00:06:47.242 user 0m0.049s 00:06:47.242 sys 0m0.076s 00:06:47.242 08:24:39 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:47.242 08:24:39 -- common/autotest_common.sh@10 -- # set +x 00:06:47.242 ************************************ 00:06:47.242 END TEST rpc_client 00:06:47.242 ************************************ 00:06:47.242 08:24:39 -- spdk/autotest.sh@178 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:06:47.242 08:24:39 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:47.242 08:24:39 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:47.242 08:24:39 -- common/autotest_common.sh@10 -- # set +x 00:06:47.242 ************************************ 00:06:47.242 START TEST json_config 00:06:47.242 ************************************ 00:06:47.242 08:24:39 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:06:47.242 08:24:39 -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:06:47.242 08:24:39 -- nvmf/common.sh@7 -- # uname -s 00:06:47.242 08:24:39 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:47.242 08:24:39 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:47.242 08:24:39 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:47.242 08:24:39 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:47.242 08:24:39 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:47.242 08:24:39 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:47.242 08:24:39 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:47.242 08:24:39 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:47.242 08:24:39 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:47.242 08:24:39 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:47.243 08:24:39 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:06:47.243 08:24:39 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:06:47.243 08:24:39 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:47.243 08:24:39 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:47.243 08:24:39 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:47.243 08:24:39 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:06:47.243 08:24:39 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:47.243 08:24:39 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:47.243 08:24:39 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:47.243 08:24:39 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:47.243 08:24:39 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:47.243 08:24:39 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:47.243 08:24:39 -- paths/export.sh@5 -- # export PATH 00:06:47.243 08:24:39 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:47.243 08:24:39 -- nvmf/common.sh@46 -- # : 0 00:06:47.243 08:24:39 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:06:47.243 08:24:39 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:06:47.243 08:24:39 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:06:47.243 08:24:39 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:47.243 08:24:39 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:47.243 08:24:39 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:06:47.243 08:24:39 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:06:47.243 08:24:39 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:06:47.243 08:24:39 -- json_config/json_config.sh@10 -- # [[ 0 -eq 1 ]] 00:06:47.243 08:24:39 -- json_config/json_config.sh@14 -- # [[ 0 -ne 1 ]] 00:06:47.243 08:24:39 -- json_config/json_config.sh@14 -- # [[ 0 -eq 1 ]] 00:06:47.243 08:24:39 -- json_config/json_config.sh@25 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:06:47.243 08:24:39 -- json_config/json_config.sh@26 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:06:47.243 WARNING: No tests are enabled so not running JSON configuration tests 00:06:47.243 08:24:39 -- json_config/json_config.sh@27 -- # exit 0 00:06:47.243 00:06:47.243 real 0m0.100s 00:06:47.243 user 0m0.048s 00:06:47.243 sys 0m0.053s 00:06:47.243 08:24:39 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:47.243 08:24:39 -- common/autotest_common.sh@10 -- # set +x 00:06:47.243 ************************************ 00:06:47.243 END TEST json_config 00:06:47.243 ************************************ 00:06:47.243 08:24:39 -- spdk/autotest.sh@179 -- # run_test json_config_extra_key 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:47.243 08:24:39 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:47.243 08:24:39 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:47.243 08:24:39 -- common/autotest_common.sh@10 -- # set +x 00:06:47.243 ************************************ 00:06:47.243 START TEST json_config_extra_key 00:06:47.243 ************************************ 00:06:47.243 08:24:39 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:47.501 08:24:39 -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:06:47.501 08:24:39 -- nvmf/common.sh@7 -- # uname -s 00:06:47.501 08:24:39 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:47.501 08:24:39 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:47.501 08:24:39 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:47.501 08:24:39 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:47.501 08:24:39 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:47.501 08:24:39 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:47.501 08:24:39 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:47.501 08:24:39 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:47.501 08:24:39 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:47.501 08:24:39 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:47.501 08:24:39 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:06:47.501 08:24:39 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:06:47.501 08:24:39 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:47.501 08:24:39 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:47.501 08:24:39 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:47.501 08:24:39 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:06:47.501 08:24:39 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:47.501 08:24:39 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:47.501 08:24:39 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:47.501 08:24:39 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:47.501 08:24:39 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:47.501 08:24:39 -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:47.501 08:24:39 -- paths/export.sh@5 -- # export PATH 00:06:47.501 08:24:39 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:47.501 08:24:39 -- nvmf/common.sh@46 -- # : 0 00:06:47.501 08:24:39 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:06:47.502 08:24:39 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:06:47.502 08:24:39 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:06:47.502 08:24:39 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:47.502 08:24:39 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:47.502 08:24:39 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:06:47.502 08:24:39 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:06:47.502 08:24:39 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:06:47.502 08:24:39 -- json_config/json_config_extra_key.sh@16 -- # app_pid=(['target']='') 00:06:47.502 08:24:39 -- json_config/json_config_extra_key.sh@16 -- # declare -A app_pid 00:06:47.502 08:24:39 -- json_config/json_config_extra_key.sh@17 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:06:47.502 08:24:39 -- json_config/json_config_extra_key.sh@17 -- # declare -A app_socket 00:06:47.502 08:24:39 -- json_config/json_config_extra_key.sh@18 -- # app_params=(['target']='-m 0x1 -s 1024') 00:06:47.502 08:24:39 -- json_config/json_config_extra_key.sh@18 -- # declare -A app_params 00:06:47.502 08:24:39 -- json_config/json_config_extra_key.sh@19 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:06:47.502 08:24:39 -- json_config/json_config_extra_key.sh@19 -- # declare -A configs_path 00:06:47.502 08:24:39 -- json_config/json_config_extra_key.sh@74 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:06:47.502 08:24:39 -- json_config/json_config_extra_key.sh@76 -- # echo 'INFO: launching applications...' 00:06:47.502 INFO: launching applications... 00:06:47.502 08:24:39 -- json_config/json_config_extra_key.sh@77 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:06:47.502 08:24:39 -- json_config/json_config_extra_key.sh@24 -- # local app=target 00:06:47.502 08:24:39 -- json_config/json_config_extra_key.sh@25 -- # shift 00:06:47.502 08:24:39 -- json_config/json_config_extra_key.sh@27 -- # [[ -n 22 ]] 00:06:47.502 08:24:39 -- json_config/json_config_extra_key.sh@28 -- # [[ -z '' ]] 00:06:47.502 08:24:39 -- json_config/json_config_extra_key.sh@31 -- # app_pid[$app]=991014 00:06:47.502 08:24:39 -- json_config/json_config_extra_key.sh@33 -- # echo 'Waiting for target to run...' 00:06:47.502 Waiting for target to run... 
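The json_config_extra_key harness tracks its single 'target' app through the bash associative arrays echoed above (app_pid, app_socket, app_params, configs_path). A minimal sketch of the launch step, assuming only what this trace shows -- the array names, the -m 0x1 -s 1024 parameters, and the /var/tmp/spdk_tgt.sock RPC socket:

    # launch spdk_tgt with the extra-key JSON config and remember its pid
    declare -A app_pid
    declare -A app_socket=([target]=/var/tmp/spdk_tgt.sock)
    declare -A app_params=([target]='-m 0x1 -s 1024')
    build/bin/spdk_tgt ${app_params[target]} -r ${app_socket[target]} \
        --json test/json_config/extra_key.json &
    app_pid[target]=$!    # 991014 in this run

The full invocation, with the workspace-absolute paths, is traced just below.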
00:06:47.502 08:24:39 -- json_config/json_config_extra_key.sh@34 -- # waitforlisten 991014 /var/tmp/spdk_tgt.sock 00:06:47.502 08:24:39 -- common/autotest_common.sh@819 -- # '[' -z 991014 ']' 00:06:47.502 08:24:39 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:47.502 08:24:39 -- json_config/json_config_extra_key.sh@30 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:06:47.502 08:24:39 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:47.502 08:24:39 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:47.502 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:47.502 08:24:39 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:47.502 08:24:39 -- common/autotest_common.sh@10 -- # set +x 00:06:47.502 [2024-10-04 08:24:40.019084] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:06:47.502 [2024-10-04 08:24:40.019170] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid991014 ] 00:06:47.502 EAL: No free 2048 kB hugepages reported on node 1 00:06:47.760 [2024-10-04 08:24:40.297956] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:47.760 [2024-10-04 08:24:40.316886] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:47.760 [2024-10-04 08:24:40.316988] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:48.327 08:24:40 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:48.327 08:24:40 -- common/autotest_common.sh@852 -- # return 0 00:06:48.327 08:24:40 -- json_config/json_config_extra_key.sh@35 -- # echo '' 00:06:48.327 00:06:48.327 08:24:40 -- json_config/json_config_extra_key.sh@79 -- # echo 'INFO: shutting down applications...' 00:06:48.327 INFO: shutting down applications... 
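json_config_test_shutdown_app, traced next, does not block on wait; it sends SIGINT and then polls the pid with kill -0, giving the target up to 30 half-second intervals to exit. A condensed sketch of that loop, with the pid, retry bound, and sleep taken from this run's trace:

    kill -SIGINT 991014
    for ((i = 0; i < 30; i++)); do
        kill -0 991014 2>/dev/null || break    # target exited; stop polling
        sleep 0.5
    done
    echo 'SPDK target shutdown done'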
00:06:48.327 08:24:40 -- json_config/json_config_extra_key.sh@80 -- # json_config_test_shutdown_app target 00:06:48.327 08:24:40 -- json_config/json_config_extra_key.sh@40 -- # local app=target 00:06:48.327 08:24:40 -- json_config/json_config_extra_key.sh@43 -- # [[ -n 22 ]] 00:06:48.327 08:24:40 -- json_config/json_config_extra_key.sh@44 -- # [[ -n 991014 ]] 00:06:48.327 08:24:40 -- json_config/json_config_extra_key.sh@47 -- # kill -SIGINT 991014 00:06:48.327 08:24:40 -- json_config/json_config_extra_key.sh@49 -- # (( i = 0 )) 00:06:48.327 08:24:40 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:06:48.327 08:24:40 -- json_config/json_config_extra_key.sh@50 -- # kill -0 991014 00:06:48.327 08:24:40 -- json_config/json_config_extra_key.sh@54 -- # sleep 0.5 00:06:48.895 08:24:41 -- json_config/json_config_extra_key.sh@49 -- # (( i++ )) 00:06:48.895 08:24:41 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:06:48.895 08:24:41 -- json_config/json_config_extra_key.sh@50 -- # kill -0 991014 00:06:48.895 08:24:41 -- json_config/json_config_extra_key.sh@51 -- # app_pid[$app]= 00:06:48.895 08:24:41 -- json_config/json_config_extra_key.sh@52 -- # break 00:06:48.895 08:24:41 -- json_config/json_config_extra_key.sh@57 -- # [[ -n '' ]] 00:06:48.895 08:24:41 -- json_config/json_config_extra_key.sh@62 -- # echo 'SPDK target shutdown done' 00:06:48.895 SPDK target shutdown done 00:06:48.895 08:24:41 -- json_config/json_config_extra_key.sh@82 -- # echo Success 00:06:48.895 Success 00:06:48.895 00:06:48.895 real 0m1.474s 00:06:48.895 user 0m1.222s 00:06:48.895 sys 0m0.399s 00:06:48.895 08:24:41 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:48.895 08:24:41 -- common/autotest_common.sh@10 -- # set +x 00:06:48.895 ************************************ 00:06:48.895 END TEST json_config_extra_key 00:06:48.895 ************************************ 00:06:48.895 08:24:41 -- spdk/autotest.sh@180 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:48.895 08:24:41 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:48.895 08:24:41 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:48.895 08:24:41 -- common/autotest_common.sh@10 -- # set +x 00:06:48.895 ************************************ 00:06:48.895 START TEST alias_rpc 00:06:48.895 ************************************ 00:06:48.895 08:24:41 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:48.895 * Looking for test storage... 00:06:48.895 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:06:48.895 08:24:41 -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:48.895 08:24:41 -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=991338 00:06:48.895 08:24:41 -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 991338 00:06:48.895 08:24:41 -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:48.895 08:24:41 -- common/autotest_common.sh@819 -- # '[' -z 991338 ']' 00:06:48.895 08:24:41 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:48.895 08:24:41 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:48.895 08:24:41 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:06:48.895 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:48.895 08:24:41 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:48.895 08:24:41 -- common/autotest_common.sh@10 -- # set +x 00:06:48.895 [2024-10-04 08:24:41.536501] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:06:48.896 [2024-10-04 08:24:41.536584] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid991338 ] 00:06:48.896 EAL: No free 2048 kB hugepages reported on node 1 00:06:49.154 [2024-10-04 08:24:41.604953] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:49.154 [2024-10-04 08:24:41.641709] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:49.154 [2024-10-04 08:24:41.641821] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:49.720 08:24:42 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:49.720 08:24:42 -- common/autotest_common.sh@852 -- # return 0 00:06:49.720 08:24:42 -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:06:49.978 08:24:42 -- alias_rpc/alias_rpc.sh@19 -- # killprocess 991338 00:06:49.978 08:24:42 -- common/autotest_common.sh@926 -- # '[' -z 991338 ']' 00:06:49.978 08:24:42 -- common/autotest_common.sh@930 -- # kill -0 991338 00:06:49.978 08:24:42 -- common/autotest_common.sh@931 -- # uname 00:06:49.978 08:24:42 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:49.978 08:24:42 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 991338 00:06:49.978 08:24:42 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:49.978 08:24:42 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:49.978 08:24:42 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 991338' 00:06:49.978 killing process with pid 991338 00:06:49.978 08:24:42 -- common/autotest_common.sh@945 -- # kill 991338 00:06:49.978 08:24:42 -- common/autotest_common.sh@950 -- # wait 991338 00:06:50.237 00:06:50.237 real 0m1.491s 00:06:50.237 user 0m1.608s 00:06:50.237 sys 0m0.435s 00:06:50.237 08:24:42 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:50.237 08:24:42 -- common/autotest_common.sh@10 -- # set +x 00:06:50.237 ************************************ 00:06:50.237 END TEST alias_rpc 00:06:50.237 ************************************ 00:06:50.495 08:24:42 -- spdk/autotest.sh@182 -- # [[ 0 -eq 0 ]] 00:06:50.495 08:24:42 -- spdk/autotest.sh@183 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:50.495 08:24:42 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:50.495 08:24:42 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:50.495 08:24:42 -- common/autotest_common.sh@10 -- # set +x 00:06:50.495 ************************************ 00:06:50.495 START TEST spdkcli_tcp 00:06:50.495 ************************************ 00:06:50.495 08:24:42 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:50.495 * Looking for test storage... 
00:06:50.495 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:06:50.495 08:24:43 -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:06:50.495 08:24:43 -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:06:50.495 08:24:43 -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:06:50.495 08:24:43 -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:06:50.495 08:24:43 -- spdkcli/tcp.sh@19 -- # PORT=9998 00:06:50.495 08:24:43 -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:06:50.495 08:24:43 -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:06:50.495 08:24:43 -- common/autotest_common.sh@712 -- # xtrace_disable 00:06:50.495 08:24:43 -- common/autotest_common.sh@10 -- # set +x 00:06:50.495 08:24:43 -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=991659 00:06:50.495 08:24:43 -- spdkcli/tcp.sh@27 -- # waitforlisten 991659 00:06:50.495 08:24:43 -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:06:50.495 08:24:43 -- common/autotest_common.sh@819 -- # '[' -z 991659 ']' 00:06:50.495 08:24:43 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:50.495 08:24:43 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:50.495 08:24:43 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:50.495 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:50.495 08:24:43 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:50.495 08:24:43 -- common/autotest_common.sh@10 -- # set +x 00:06:50.495 [2024-10-04 08:24:43.087666] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
00:06:50.495 [2024-10-04 08:24:43.087744] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid991659 ] 00:06:50.495 EAL: No free 2048 kB hugepages reported on node 1 00:06:50.495 [2024-10-04 08:24:43.155692] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:50.753 [2024-10-04 08:24:43.192359] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:50.753 [2024-10-04 08:24:43.192513] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:50.753 [2024-10-04 08:24:43.192515] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:51.319 08:24:43 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:51.319 08:24:43 -- common/autotest_common.sh@852 -- # return 0 00:06:51.319 08:24:43 -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:06:51.319 08:24:43 -- spdkcli/tcp.sh@31 -- # socat_pid=991793 00:06:51.319 08:24:43 -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:06:51.585 [ 00:06:51.585 "spdk_get_version", 00:06:51.585 "rpc_get_methods", 00:06:51.585 "trace_get_info", 00:06:51.585 "trace_get_tpoint_group_mask", 00:06:51.585 "trace_disable_tpoint_group", 00:06:51.585 "trace_enable_tpoint_group", 00:06:51.585 "trace_clear_tpoint_mask", 00:06:51.585 "trace_set_tpoint_mask", 00:06:51.585 "vfu_tgt_set_base_path", 00:06:51.585 "framework_get_pci_devices", 00:06:51.585 "framework_get_config", 00:06:51.585 "framework_get_subsystems", 00:06:51.585 "iobuf_get_stats", 00:06:51.585 "iobuf_set_options", 00:06:51.585 "sock_set_default_impl", 00:06:51.585 "sock_impl_set_options", 00:06:51.585 "sock_impl_get_options", 00:06:51.585 "vmd_rescan", 00:06:51.585 "vmd_remove_device", 00:06:51.585 "vmd_enable", 00:06:51.585 "accel_get_stats", 00:06:51.585 "accel_set_options", 00:06:51.585 "accel_set_driver", 00:06:51.585 "accel_crypto_key_destroy", 00:06:51.585 "accel_crypto_keys_get", 00:06:51.585 "accel_crypto_key_create", 00:06:51.585 "accel_assign_opc", 00:06:51.585 "accel_get_module_info", 00:06:51.585 "accel_get_opc_assignments", 00:06:51.585 "notify_get_notifications", 00:06:51.585 "notify_get_types", 00:06:51.585 "bdev_get_histogram", 00:06:51.585 "bdev_enable_histogram", 00:06:51.585 "bdev_set_qos_limit", 00:06:51.585 "bdev_set_qd_sampling_period", 00:06:51.585 "bdev_get_bdevs", 00:06:51.585 "bdev_reset_iostat", 00:06:51.585 "bdev_get_iostat", 00:06:51.585 "bdev_examine", 00:06:51.585 "bdev_wait_for_examine", 00:06:51.585 "bdev_set_options", 00:06:51.585 "scsi_get_devices", 00:06:51.585 "thread_set_cpumask", 00:06:51.585 "framework_get_scheduler", 00:06:51.585 "framework_set_scheduler", 00:06:51.585 "framework_get_reactors", 00:06:51.585 "thread_get_io_channels", 00:06:51.585 "thread_get_pollers", 00:06:51.585 "thread_get_stats", 00:06:51.585 "framework_monitor_context_switch", 00:06:51.585 "spdk_kill_instance", 00:06:51.585 "log_enable_timestamps", 00:06:51.585 "log_get_flags", 00:06:51.585 "log_clear_flag", 00:06:51.585 "log_set_flag", 00:06:51.585 "log_get_level", 00:06:51.585 "log_set_level", 00:06:51.585 "log_get_print_level", 00:06:51.585 "log_set_print_level", 00:06:51.585 "framework_enable_cpumask_locks", 00:06:51.585 "framework_disable_cpumask_locks", 00:06:51.585 "framework_wait_init", 00:06:51.585 
"framework_start_init", 00:06:51.585 "virtio_blk_create_transport", 00:06:51.585 "virtio_blk_get_transports", 00:06:51.585 "vhost_controller_set_coalescing", 00:06:51.585 "vhost_get_controllers", 00:06:51.586 "vhost_delete_controller", 00:06:51.586 "vhost_create_blk_controller", 00:06:51.586 "vhost_scsi_controller_remove_target", 00:06:51.586 "vhost_scsi_controller_add_target", 00:06:51.586 "vhost_start_scsi_controller", 00:06:51.586 "vhost_create_scsi_controller", 00:06:51.586 "ublk_recover_disk", 00:06:51.586 "ublk_get_disks", 00:06:51.586 "ublk_stop_disk", 00:06:51.586 "ublk_start_disk", 00:06:51.586 "ublk_destroy_target", 00:06:51.586 "ublk_create_target", 00:06:51.586 "nbd_get_disks", 00:06:51.586 "nbd_stop_disk", 00:06:51.586 "nbd_start_disk", 00:06:51.586 "env_dpdk_get_mem_stats", 00:06:51.586 "nvmf_subsystem_get_listeners", 00:06:51.586 "nvmf_subsystem_get_qpairs", 00:06:51.586 "nvmf_subsystem_get_controllers", 00:06:51.586 "nvmf_get_stats", 00:06:51.586 "nvmf_get_transports", 00:06:51.586 "nvmf_create_transport", 00:06:51.586 "nvmf_get_targets", 00:06:51.586 "nvmf_delete_target", 00:06:51.586 "nvmf_create_target", 00:06:51.586 "nvmf_subsystem_allow_any_host", 00:06:51.586 "nvmf_subsystem_remove_host", 00:06:51.586 "nvmf_subsystem_add_host", 00:06:51.586 "nvmf_subsystem_remove_ns", 00:06:51.586 "nvmf_subsystem_add_ns", 00:06:51.586 "nvmf_subsystem_listener_set_ana_state", 00:06:51.586 "nvmf_discovery_get_referrals", 00:06:51.586 "nvmf_discovery_remove_referral", 00:06:51.586 "nvmf_discovery_add_referral", 00:06:51.586 "nvmf_subsystem_remove_listener", 00:06:51.586 "nvmf_subsystem_add_listener", 00:06:51.586 "nvmf_delete_subsystem", 00:06:51.586 "nvmf_create_subsystem", 00:06:51.586 "nvmf_get_subsystems", 00:06:51.586 "nvmf_set_crdt", 00:06:51.586 "nvmf_set_config", 00:06:51.586 "nvmf_set_max_subsystems", 00:06:51.586 "iscsi_set_options", 00:06:51.586 "iscsi_get_auth_groups", 00:06:51.586 "iscsi_auth_group_remove_secret", 00:06:51.586 "iscsi_auth_group_add_secret", 00:06:51.586 "iscsi_delete_auth_group", 00:06:51.586 "iscsi_create_auth_group", 00:06:51.586 "iscsi_set_discovery_auth", 00:06:51.586 "iscsi_get_options", 00:06:51.586 "iscsi_target_node_request_logout", 00:06:51.586 "iscsi_target_node_set_redirect", 00:06:51.586 "iscsi_target_node_set_auth", 00:06:51.586 "iscsi_target_node_add_lun", 00:06:51.586 "iscsi_get_connections", 00:06:51.586 "iscsi_portal_group_set_auth", 00:06:51.586 "iscsi_start_portal_group", 00:06:51.586 "iscsi_delete_portal_group", 00:06:51.586 "iscsi_create_portal_group", 00:06:51.586 "iscsi_get_portal_groups", 00:06:51.586 "iscsi_delete_target_node", 00:06:51.586 "iscsi_target_node_remove_pg_ig_maps", 00:06:51.586 "iscsi_target_node_add_pg_ig_maps", 00:06:51.586 "iscsi_create_target_node", 00:06:51.586 "iscsi_get_target_nodes", 00:06:51.586 "iscsi_delete_initiator_group", 00:06:51.586 "iscsi_initiator_group_remove_initiators", 00:06:51.586 "iscsi_initiator_group_add_initiators", 00:06:51.586 "iscsi_create_initiator_group", 00:06:51.586 "iscsi_get_initiator_groups", 00:06:51.586 "vfu_virtio_create_scsi_endpoint", 00:06:51.586 "vfu_virtio_scsi_remove_target", 00:06:51.586 "vfu_virtio_scsi_add_target", 00:06:51.586 "vfu_virtio_create_blk_endpoint", 00:06:51.586 "vfu_virtio_delete_endpoint", 00:06:51.586 "iaa_scan_accel_module", 00:06:51.586 "dsa_scan_accel_module", 00:06:51.586 "ioat_scan_accel_module", 00:06:51.586 "accel_error_inject_error", 00:06:51.586 "bdev_iscsi_delete", 00:06:51.586 "bdev_iscsi_create", 00:06:51.586 "bdev_iscsi_set_options", 
00:06:51.586 "bdev_virtio_attach_controller", 00:06:51.586 "bdev_virtio_scsi_get_devices", 00:06:51.586 "bdev_virtio_detach_controller", 00:06:51.586 "bdev_virtio_blk_set_hotplug", 00:06:51.586 "bdev_ftl_set_property", 00:06:51.586 "bdev_ftl_get_properties", 00:06:51.586 "bdev_ftl_get_stats", 00:06:51.586 "bdev_ftl_unmap", 00:06:51.586 "bdev_ftl_unload", 00:06:51.586 "bdev_ftl_delete", 00:06:51.586 "bdev_ftl_load", 00:06:51.586 "bdev_ftl_create", 00:06:51.586 "bdev_aio_delete", 00:06:51.586 "bdev_aio_rescan", 00:06:51.586 "bdev_aio_create", 00:06:51.586 "blobfs_create", 00:06:51.586 "blobfs_detect", 00:06:51.586 "blobfs_set_cache_size", 00:06:51.586 "bdev_zone_block_delete", 00:06:51.586 "bdev_zone_block_create", 00:06:51.586 "bdev_delay_delete", 00:06:51.586 "bdev_delay_create", 00:06:51.586 "bdev_delay_update_latency", 00:06:51.586 "bdev_split_delete", 00:06:51.586 "bdev_split_create", 00:06:51.586 "bdev_error_inject_error", 00:06:51.586 "bdev_error_delete", 00:06:51.586 "bdev_error_create", 00:06:51.586 "bdev_raid_set_options", 00:06:51.586 "bdev_raid_remove_base_bdev", 00:06:51.586 "bdev_raid_add_base_bdev", 00:06:51.586 "bdev_raid_delete", 00:06:51.586 "bdev_raid_create", 00:06:51.586 "bdev_raid_get_bdevs", 00:06:51.586 "bdev_lvol_grow_lvstore", 00:06:51.586 "bdev_lvol_get_lvols", 00:06:51.586 "bdev_lvol_get_lvstores", 00:06:51.586 "bdev_lvol_delete", 00:06:51.586 "bdev_lvol_set_read_only", 00:06:51.586 "bdev_lvol_resize", 00:06:51.586 "bdev_lvol_decouple_parent", 00:06:51.586 "bdev_lvol_inflate", 00:06:51.586 "bdev_lvol_rename", 00:06:51.586 "bdev_lvol_clone_bdev", 00:06:51.586 "bdev_lvol_clone", 00:06:51.586 "bdev_lvol_snapshot", 00:06:51.586 "bdev_lvol_create", 00:06:51.586 "bdev_lvol_delete_lvstore", 00:06:51.586 "bdev_lvol_rename_lvstore", 00:06:51.586 "bdev_lvol_create_lvstore", 00:06:51.586 "bdev_passthru_delete", 00:06:51.586 "bdev_passthru_create", 00:06:51.586 "bdev_nvme_cuse_unregister", 00:06:51.586 "bdev_nvme_cuse_register", 00:06:51.586 "bdev_opal_new_user", 00:06:51.586 "bdev_opal_set_lock_state", 00:06:51.586 "bdev_opal_delete", 00:06:51.586 "bdev_opal_get_info", 00:06:51.586 "bdev_opal_create", 00:06:51.586 "bdev_nvme_opal_revert", 00:06:51.586 "bdev_nvme_opal_init", 00:06:51.586 "bdev_nvme_send_cmd", 00:06:51.586 "bdev_nvme_get_path_iostat", 00:06:51.586 "bdev_nvme_get_mdns_discovery_info", 00:06:51.586 "bdev_nvme_stop_mdns_discovery", 00:06:51.586 "bdev_nvme_start_mdns_discovery", 00:06:51.586 "bdev_nvme_set_multipath_policy", 00:06:51.586 "bdev_nvme_set_preferred_path", 00:06:51.586 "bdev_nvme_get_io_paths", 00:06:51.586 "bdev_nvme_remove_error_injection", 00:06:51.586 "bdev_nvme_add_error_injection", 00:06:51.586 "bdev_nvme_get_discovery_info", 00:06:51.586 "bdev_nvme_stop_discovery", 00:06:51.586 "bdev_nvme_start_discovery", 00:06:51.586 "bdev_nvme_get_controller_health_info", 00:06:51.586 "bdev_nvme_disable_controller", 00:06:51.586 "bdev_nvme_enable_controller", 00:06:51.586 "bdev_nvme_reset_controller", 00:06:51.586 "bdev_nvme_get_transport_statistics", 00:06:51.586 "bdev_nvme_apply_firmware", 00:06:51.586 "bdev_nvme_detach_controller", 00:06:51.586 "bdev_nvme_get_controllers", 00:06:51.586 "bdev_nvme_attach_controller", 00:06:51.586 "bdev_nvme_set_hotplug", 00:06:51.586 "bdev_nvme_set_options", 00:06:51.586 "bdev_null_resize", 00:06:51.586 "bdev_null_delete", 00:06:51.586 "bdev_null_create", 00:06:51.586 "bdev_malloc_delete", 00:06:51.586 "bdev_malloc_create" 00:06:51.586 ] 00:06:51.586 08:24:44 -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 
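Everything in the method array above came back over TCP rather than the usual UNIX socket: tcp.sh bridges /var/tmp/spdk.sock to 127.0.0.1:9998 with socat and then points rpc.py at the TCP side, exactly as traced at the start of this test. Condensed, the round trip is:

    # bridge the target's UNIX socket to TCP (socat_pid=991793 in this run)
    socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock &
    # query through the bridge: 100 retries, 2 s timeout, as in the trace
    scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods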
00:06:51.586 08:24:44 -- common/autotest_common.sh@718 -- # xtrace_disable 00:06:51.586 08:24:44 -- common/autotest_common.sh@10 -- # set +x 00:06:51.586 08:24:44 -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:06:51.586 08:24:44 -- spdkcli/tcp.sh@38 -- # killprocess 991659 00:06:51.586 08:24:44 -- common/autotest_common.sh@926 -- # '[' -z 991659 ']' 00:06:51.586 08:24:44 -- common/autotest_common.sh@930 -- # kill -0 991659 00:06:51.586 08:24:44 -- common/autotest_common.sh@931 -- # uname 00:06:51.586 08:24:44 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:51.586 08:24:44 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 991659 00:06:51.586 08:24:44 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:51.586 08:24:44 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:51.586 08:24:44 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 991659' 00:06:51.586 killing process with pid 991659 00:06:51.586 08:24:44 -- common/autotest_common.sh@945 -- # kill 991659 00:06:51.586 08:24:44 -- common/autotest_common.sh@950 -- # wait 991659 00:06:51.845 00:06:51.845 real 0m1.539s 00:06:51.845 user 0m2.929s 00:06:51.845 sys 0m0.467s 00:06:51.845 08:24:44 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:51.845 08:24:44 -- common/autotest_common.sh@10 -- # set +x 00:06:51.845 ************************************ 00:06:51.845 END TEST spdkcli_tcp 00:06:51.845 ************************************ 00:06:52.104 08:24:44 -- spdk/autotest.sh@186 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:52.104 08:24:44 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:52.104 08:24:44 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:52.104 08:24:44 -- common/autotest_common.sh@10 -- # set +x 00:06:52.104 ************************************ 00:06:52.104 START TEST dpdk_mem_utility 00:06:52.104 ************************************ 00:06:52.104 08:24:44 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:52.104 * Looking for test storage... 00:06:52.104 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:06:52.104 08:24:44 -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:52.104 08:24:44 -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=992015 00:06:52.104 08:24:44 -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 992015 00:06:52.104 08:24:44 -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:52.104 08:24:44 -- common/autotest_common.sh@819 -- # '[' -z 992015 ']' 00:06:52.104 08:24:44 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:52.104 08:24:44 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:52.104 08:24:44 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:52.104 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
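waitforlisten 992015, echoed above, is the common gate every suite here uses between launching spdk_tgt and issuing RPCs. Its internals are not shown in this log; a plausible approximation, assuming it simply retries a cheap RPC (spdk_get_version is one of the methods listed earlier) until the socket answers or max_retries=100 runs out:

    # hypothetical stand-in for waitforlisten: poll the new RPC socket
    for ((i = 0; i < 100; i++)); do
        scripts/rpc.py -t 1 -s /var/tmp/spdk.sock spdk_get_version &>/dev/null && break
        sleep 0.5
    done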
00:06:52.104 08:24:44 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:52.104 08:24:44 -- common/autotest_common.sh@10 -- # set +x 00:06:52.104 [2024-10-04 08:24:44.664322] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:06:52.104 [2024-10-04 08:24:44.664413] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid992015 ] 00:06:52.104 EAL: No free 2048 kB hugepages reported on node 1 00:06:52.104 [2024-10-04 08:24:44.733297] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:52.104 [2024-10-04 08:24:44.770647] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:52.104 [2024-10-04 08:24:44.770758] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.038 08:24:45 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:53.038 08:24:45 -- common/autotest_common.sh@852 -- # return 0 00:06:53.038 08:24:45 -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:06:53.038 08:24:45 -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:06:53.038 08:24:45 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:53.038 08:24:45 -- common/autotest_common.sh@10 -- # set +x 00:06:53.038 { 00:06:53.038 "filename": "/tmp/spdk_mem_dump.txt" 00:06:53.038 } 00:06:53.038 08:24:45 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:53.038 08:24:45 -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:53.038 DPDK memory size 814.000000 MiB in 1 heap(s) 00:06:53.038 1 heaps totaling size 814.000000 MiB 00:06:53.038 size: 814.000000 MiB heap id: 0 00:06:53.038 end heaps---------- 00:06:53.038 8 mempools totaling size 598.116089 MiB 00:06:53.038 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:06:53.038 size: 158.602051 MiB name: PDU_data_out_Pool 00:06:53.038 size: 84.521057 MiB name: bdev_io_992015 00:06:53.038 size: 51.011292 MiB name: evtpool_992015 00:06:53.038 size: 50.003479 MiB name: msgpool_992015 00:06:53.038 size: 21.763794 MiB name: PDU_Pool 00:06:53.038 size: 19.513306 MiB name: SCSI_TASK_Pool 00:06:53.038 size: 0.026123 MiB name: Session_Pool 00:06:53.038 end mempools------- 00:06:53.038 6 memzones totaling size 4.142822 MiB 00:06:53.038 size: 1.000366 MiB name: RG_ring_0_992015 00:06:53.038 size: 1.000366 MiB name: RG_ring_1_992015 00:06:53.038 size: 1.000366 MiB name: RG_ring_4_992015 00:06:53.038 size: 1.000366 MiB name: RG_ring_5_992015 00:06:53.038 size: 0.125366 MiB name: RG_ring_2_992015 00:06:53.038 size: 0.015991 MiB name: RG_ring_3_992015 00:06:53.038 end memzones------- 00:06:53.039 08:24:45 -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:06:53.039 heap id: 0 total size: 814.000000 MiB number of busy elements: 41 number of free elements: 15 00:06:53.039 list of free elements. 
size: 12.519348 MiB 00:06:53.039 element at address: 0x200000400000 with size: 1.999512 MiB 00:06:53.039 element at address: 0x200018e00000 with size: 0.999878 MiB 00:06:53.039 element at address: 0x200019000000 with size: 0.999878 MiB 00:06:53.039 element at address: 0x200003e00000 with size: 0.996277 MiB 00:06:53.039 element at address: 0x200031c00000 with size: 0.994446 MiB 00:06:53.039 element at address: 0x200013800000 with size: 0.978699 MiB 00:06:53.039 element at address: 0x200007000000 with size: 0.959839 MiB 00:06:53.039 element at address: 0x200019200000 with size: 0.936584 MiB 00:06:53.039 element at address: 0x200000200000 with size: 0.841614 MiB 00:06:53.039 element at address: 0x20001aa00000 with size: 0.582886 MiB 00:06:53.039 element at address: 0x20000b200000 with size: 0.490723 MiB 00:06:53.039 element at address: 0x200000800000 with size: 0.487793 MiB 00:06:53.039 element at address: 0x200019400000 with size: 0.485657 MiB 00:06:53.039 element at address: 0x200027e00000 with size: 0.410034 MiB 00:06:53.039 element at address: 0x200003a00000 with size: 0.355530 MiB 00:06:53.039 list of standard malloc elements. size: 199.218079 MiB 00:06:53.039 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:06:53.039 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:06:53.039 element at address: 0x200018efff80 with size: 1.000122 MiB 00:06:53.039 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:06:53.039 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:06:53.039 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:06:53.039 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:06:53.039 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:06:53.039 element at address: 0x2000192efdc0 with size: 0.000305 MiB 00:06:53.039 element at address: 0x2000002d7740 with size: 0.000183 MiB 00:06:53.039 element at address: 0x2000002d7800 with size: 0.000183 MiB 00:06:53.039 element at address: 0x2000002d78c0 with size: 0.000183 MiB 00:06:53.039 element at address: 0x2000002d7ac0 with size: 0.000183 MiB 00:06:53.039 element at address: 0x2000002d7b80 with size: 0.000183 MiB 00:06:53.039 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:06:53.039 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:06:53.039 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:06:53.039 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:06:53.039 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:06:53.039 element at address: 0x200003a5b040 with size: 0.000183 MiB 00:06:53.039 element at address: 0x200003adb300 with size: 0.000183 MiB 00:06:53.039 element at address: 0x200003adb500 with size: 0.000183 MiB 00:06:53.039 element at address: 0x200003adf7c0 with size: 0.000183 MiB 00:06:53.039 element at address: 0x200003affa80 with size: 0.000183 MiB 00:06:53.039 element at address: 0x200003affb40 with size: 0.000183 MiB 00:06:53.039 element at address: 0x200003eff0c0 with size: 0.000183 MiB 00:06:53.039 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:06:53.039 element at address: 0x20000b27da00 with size: 0.000183 MiB 00:06:53.039 element at address: 0x20000b27dac0 with size: 0.000183 MiB 00:06:53.039 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:06:53.039 element at address: 0x2000138fa8c0 with size: 0.000183 MiB 00:06:53.039 element at address: 0x2000192efc40 with size: 0.000183 MiB 00:06:53.039 element at address: 0x2000192efd00 with size: 0.000183 MiB 
00:06:53.039 element at address: 0x2000194bc740 with size: 0.000183 MiB 00:06:53.039 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:06:53.039 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:06:53.039 element at address: 0x200027e68f80 with size: 0.000183 MiB 00:06:53.039 element at address: 0x200027e69040 with size: 0.000183 MiB 00:06:53.039 element at address: 0x200027e6fc40 with size: 0.000183 MiB 00:06:53.039 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:06:53.039 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:06:53.039 list of memzone associated elements. size: 602.262573 MiB 00:06:53.039 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:06:53.039 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:06:53.039 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:06:53.039 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:06:53.039 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:06:53.039 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_992015_0 00:06:53.039 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:06:53.039 associated memzone info: size: 48.002930 MiB name: MP_evtpool_992015_0 00:06:53.039 element at address: 0x200003fff380 with size: 48.003052 MiB 00:06:53.039 associated memzone info: size: 48.002930 MiB name: MP_msgpool_992015_0 00:06:53.039 element at address: 0x2000195be940 with size: 20.255554 MiB 00:06:53.039 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:06:53.039 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:06:53.039 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:06:53.039 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:06:53.039 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_992015 00:06:53.039 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:06:53.039 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_992015 00:06:53.039 element at address: 0x2000002d7d00 with size: 1.008118 MiB 00:06:53.039 associated memzone info: size: 1.007996 MiB name: MP_evtpool_992015 00:06:53.039 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:06:53.039 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:06:53.039 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:06:53.039 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:06:53.039 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:06:53.039 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:06:53.039 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:06:53.039 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:06:53.039 element at address: 0x200003eff180 with size: 1.000488 MiB 00:06:53.039 associated memzone info: size: 1.000366 MiB name: RG_ring_0_992015 00:06:53.039 element at address: 0x200003affc00 with size: 1.000488 MiB 00:06:53.039 associated memzone info: size: 1.000366 MiB name: RG_ring_1_992015 00:06:53.039 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:06:53.039 associated memzone info: size: 1.000366 MiB name: RG_ring_4_992015 00:06:53.039 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:06:53.039 associated memzone info: size: 1.000366 MiB name: RG_ring_5_992015 00:06:53.039 element at address: 0x200003a5b100 with size: 0.500488 MiB 00:06:53.039 associated memzone 
info: size: 0.500366 MiB name: RG_MP_bdev_io_992015 00:06:53.039 element at address: 0x20000b27db80 with size: 0.500488 MiB 00:06:53.039 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:06:53.039 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:06:53.039 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:06:53.039 element at address: 0x20001947c540 with size: 0.250488 MiB 00:06:53.039 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:06:53.039 element at address: 0x200003adf880 with size: 0.125488 MiB 00:06:53.039 associated memzone info: size: 0.125366 MiB name: RG_ring_2_992015 00:06:53.039 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:06:53.039 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:06:53.039 element at address: 0x200027e69100 with size: 0.023743 MiB 00:06:53.039 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:06:53.039 element at address: 0x200003adb5c0 with size: 0.016113 MiB 00:06:53.039 associated memzone info: size: 0.015991 MiB name: RG_ring_3_992015 00:06:53.039 element at address: 0x200027e6f240 with size: 0.002441 MiB 00:06:53.039 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:06:53.039 element at address: 0x2000002d7980 with size: 0.000305 MiB 00:06:53.039 associated memzone info: size: 0.000183 MiB name: MP_msgpool_992015 00:06:53.039 element at address: 0x200003adb3c0 with size: 0.000305 MiB 00:06:53.039 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_992015 00:06:53.039 element at address: 0x200027e6fd00 with size: 0.000305 MiB 00:06:53.039 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:06:53.039 08:24:45 -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:06:53.039 08:24:45 -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 992015 00:06:53.039 08:24:45 -- common/autotest_common.sh@926 -- # '[' -z 992015 ']' 00:06:53.039 08:24:45 -- common/autotest_common.sh@930 -- # kill -0 992015 00:06:53.039 08:24:45 -- common/autotest_common.sh@931 -- # uname 00:06:53.039 08:24:45 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:53.039 08:24:45 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 992015 00:06:53.039 08:24:45 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:53.039 08:24:45 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:53.039 08:24:45 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 992015' 00:06:53.039 killing process with pid 992015 00:06:53.039 08:24:45 -- common/autotest_common.sh@945 -- # kill 992015 00:06:53.039 08:24:45 -- common/autotest_common.sh@950 -- # wait 992015 00:06:53.298 00:06:53.298 real 0m1.408s 00:06:53.298 user 0m1.473s 00:06:53.298 sys 0m0.426s 00:06:53.298 08:24:45 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:53.298 08:24:45 -- common/autotest_common.sh@10 -- # set +x 00:06:53.298 ************************************ 00:06:53.298 END TEST dpdk_mem_utility 00:06:53.298 ************************************ 00:06:53.557 08:24:45 -- spdk/autotest.sh@187 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:06:53.557 08:24:45 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:53.557 08:24:45 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:53.557 08:24:45 -- common/autotest_common.sh@10 -- # set +x 00:06:53.557 
************************************ 00:06:53.557 START TEST event 00:06:53.557 ************************************ 00:06:53.557 08:24:45 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:06:53.557 * Looking for test storage... 00:06:53.557 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:06:53.557 08:24:46 -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:06:53.557 08:24:46 -- bdev/nbd_common.sh@6 -- # set -e 00:06:53.557 08:24:46 -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:53.557 08:24:46 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:06:53.557 08:24:46 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:53.557 08:24:46 -- common/autotest_common.sh@10 -- # set +x 00:06:53.557 ************************************ 00:06:53.557 START TEST event_perf 00:06:53.557 ************************************ 00:06:53.557 08:24:46 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:53.557 Running I/O for 1 seconds...[2024-10-04 08:24:46.124694] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:06:53.557 [2024-10-04 08:24:46.124785] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid992298 ] 00:06:53.557 EAL: No free 2048 kB hugepages reported on node 1 00:06:53.557 [2024-10-04 08:24:46.196237] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:53.557 [2024-10-04 08:24:46.235150] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:53.557 [2024-10-04 08:24:46.235244] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:53.557 [2024-10-04 08:24:46.235262] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:53.557 [2024-10-04 08:24:46.235263] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:54.935 Running I/O for 1 seconds... 00:06:54.935 lcore 0: 198245 00:06:54.935 lcore 1: 198244 00:06:54.935 lcore 2: 198245 00:06:54.935 lcore 3: 198244 00:06:54.935 done. 
00:06:54.935 00:06:54.935 real 0m1.183s 00:06:54.935 user 0m4.087s 00:06:54.935 sys 0m0.094s 00:06:54.935 08:24:47 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:54.935 08:24:47 -- common/autotest_common.sh@10 -- # set +x 00:06:54.935 ************************************ 00:06:54.935 END TEST event_perf 00:06:54.935 ************************************ 00:06:54.935 08:24:47 -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:54.935 08:24:47 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:06:54.935 08:24:47 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:54.935 08:24:47 -- common/autotest_common.sh@10 -- # set +x 00:06:54.935 ************************************ 00:06:54.935 START TEST event_reactor 00:06:54.935 ************************************ 00:06:54.935 08:24:47 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:54.935 [2024-10-04 08:24:47.357126] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:06:54.935 [2024-10-04 08:24:47.357232] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid992472 ] 00:06:54.935 EAL: No free 2048 kB hugepages reported on node 1 00:06:54.935 [2024-10-04 08:24:47.427101] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:54.935 [2024-10-04 08:24:47.462842] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:55.872 test_start 00:06:55.872 oneshot 00:06:55.872 tick 100 00:06:55.872 tick 100 00:06:55.872 tick 250 00:06:55.872 tick 100 00:06:55.872 tick 100 00:06:55.872 tick 100 00:06:55.872 tick 250 00:06:55.872 tick 500 00:06:55.872 tick 100 00:06:55.872 tick 100 00:06:55.872 tick 250 00:06:55.872 tick 100 00:06:55.872 tick 100 00:06:55.872 test_end 00:06:55.872 00:06:55.872 real 0m1.176s 00:06:55.872 user 0m1.083s 00:06:55.872 sys 0m0.088s 00:06:55.872 08:24:48 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:55.872 08:24:48 -- common/autotest_common.sh@10 -- # set +x 00:06:55.872 ************************************ 00:06:55.872 END TEST event_reactor 00:06:55.872 ************************************ 00:06:56.131 08:24:48 -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:56.131 08:24:48 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:06:56.131 08:24:48 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:56.131 08:24:48 -- common/autotest_common.sh@10 -- # set +x 00:06:56.131 ************************************ 00:06:56.131 START TEST event_reactor_perf 00:06:56.131 ************************************ 00:06:56.131 08:24:48 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:56.131 [2024-10-04 08:24:48.583910] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
00:06:56.131 [2024-10-04 08:24:48.584023] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid992757 ] 00:06:56.131 EAL: No free 2048 kB hugepages reported on node 1 00:06:56.131 [2024-10-04 08:24:48.656659] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:56.131 [2024-10-04 08:24:48.692666] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:57.068 test_start 00:06:57.068 test_end 00:06:57.068 Performance: 944789 events per second 00:06:57.068 00:06:57.068 real 0m1.180s 00:06:57.068 user 0m1.090s 00:06:57.068 sys 0m0.085s 00:06:57.068 08:24:49 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:57.068 08:24:49 -- common/autotest_common.sh@10 -- # set +x 00:06:57.068 ************************************ 00:06:57.068 END TEST event_reactor_perf 00:06:57.068 ************************************ 00:06:57.326 08:24:49 -- event/event.sh@49 -- # uname -s 00:06:57.326 08:24:49 -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:06:57.326 08:24:49 -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:57.326 08:24:49 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:57.326 08:24:49 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:57.326 08:24:49 -- common/autotest_common.sh@10 -- # set +x 00:06:57.326 ************************************ 00:06:57.326 START TEST event_scheduler 00:06:57.326 ************************************ 00:06:57.326 08:24:49 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:57.326 * Looking for test storage... 00:06:57.326 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:06:57.326 08:24:49 -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:06:57.326 08:24:49 -- scheduler/scheduler.sh@35 -- # scheduler_pid=993072 00:06:57.326 08:24:49 -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:06:57.326 08:24:49 -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:06:57.326 08:24:49 -- scheduler/scheduler.sh@37 -- # waitforlisten 993072 00:06:57.326 08:24:49 -- common/autotest_common.sh@819 -- # '[' -z 993072 ']' 00:06:57.326 08:24:49 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:57.326 08:24:49 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:57.326 08:24:49 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:57.326 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:57.326 08:24:49 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:57.327 08:24:49 -- common/autotest_common.sh@10 -- # set +x 00:06:57.327 [2024-10-04 08:24:49.926689] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
00:06:57.327 [2024-10-04 08:24:49.926774] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid993072 ] 00:06:57.327 EAL: No free 2048 kB hugepages reported on node 1 00:06:57.327 [2024-10-04 08:24:49.992126] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:57.585 [2024-10-04 08:24:50.032065] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:57.585 [2024-10-04 08:24:50.032146] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:57.585 [2024-10-04 08:24:50.032244] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:57.585 [2024-10-04 08:24:50.032246] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:57.585 08:24:50 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:57.585 08:24:50 -- common/autotest_common.sh@852 -- # return 0 00:06:57.585 08:24:50 -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:06:57.585 08:24:50 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:57.585 08:24:50 -- common/autotest_common.sh@10 -- # set +x 00:06:57.585 POWER: Env isn't set yet! 00:06:57.585 POWER: Attempting to initialise ACPI cpufreq power management... 00:06:57.585 POWER: Failed to write /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor 00:06:57.585 POWER: Cannot set governor of lcore 0 to userspace 00:06:57.585 POWER: Attempting to initialise PSTAT power management... 00:06:57.585 POWER: Power management governor of lcore 0 has been set to 'performance' successfully 00:06:57.585 POWER: Initialized successfully for lcore 0 power management 00:06:57.585 POWER: Power management governor of lcore 1 has been set to 'performance' successfully 00:06:57.585 POWER: Initialized successfully for lcore 1 power management 00:06:57.585 POWER: Power management governor of lcore 2 has been set to 'performance' successfully 00:06:57.585 POWER: Initialized successfully for lcore 2 power management 00:06:57.585 POWER: Power management governor of lcore 3 has been set to 'performance' successfully 00:06:57.585 POWER: Initialized successfully for lcore 3 power management 00:06:57.585 [2024-10-04 08:24:50.152360] scheduler_dynamic.c: 387:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:06:57.585 [2024-10-04 08:24:50.152377] scheduler_dynamic.c: 389:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:06:57.585 [2024-10-04 08:24:50.152388] scheduler_dynamic.c: 391:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:06:57.585 08:24:50 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:57.585 08:24:50 -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:06:57.585 08:24:50 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:57.585 08:24:50 -- common/autotest_common.sh@10 -- # set +x 00:06:57.585 [2024-10-04 08:24:50.214919] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
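The scheduler app above was started with --wait-for-rpc, so everything up to "Scheduler test application started" is driven over the RPC socket: the dynamic scheduler is selected (which also moves lcores 0-3 to the 'performance' governor), then the reactors are released. A sketch of that sequence, assuming the default /var/tmp/spdk.sock socket named in the listen message:

    RPC="/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk.sock"
    $RPC framework_set_scheduler dynamic   # scheduler_dynamic: load limit 20, core limit 80, core busy 95
    $RPC framework_start_init              # finish subsystem init; reactors begin scheduling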
00:06:57.585 08:24:50 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:57.585 08:24:50 -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:06:57.585 08:24:50 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:57.585 08:24:50 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:57.585 08:24:50 -- common/autotest_common.sh@10 -- # set +x 00:06:57.585 ************************************ 00:06:57.585 START TEST scheduler_create_thread 00:06:57.585 ************************************ 00:06:57.585 08:24:50 -- common/autotest_common.sh@1104 -- # scheduler_create_thread 00:06:57.585 08:24:50 -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:06:57.585 08:24:50 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:57.585 08:24:50 -- common/autotest_common.sh@10 -- # set +x 00:06:57.585 2 00:06:57.585 08:24:50 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:57.585 08:24:50 -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:06:57.585 08:24:50 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:57.585 08:24:50 -- common/autotest_common.sh@10 -- # set +x 00:06:57.585 3 00:06:57.585 08:24:50 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:57.585 08:24:50 -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:06:57.585 08:24:50 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:57.585 08:24:50 -- common/autotest_common.sh@10 -- # set +x 00:06:57.585 4 00:06:57.585 08:24:50 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:57.585 08:24:50 -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:06:57.585 08:24:50 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:57.586 08:24:50 -- common/autotest_common.sh@10 -- # set +x 00:06:57.844 5 00:06:57.844 08:24:50 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:57.844 08:24:50 -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:06:57.844 08:24:50 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:57.844 08:24:50 -- common/autotest_common.sh@10 -- # set +x 00:06:57.844 6 00:06:57.844 08:24:50 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:57.844 08:24:50 -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:06:57.844 08:24:50 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:57.844 08:24:50 -- common/autotest_common.sh@10 -- # set +x 00:06:57.844 7 00:06:57.844 08:24:50 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:57.844 08:24:50 -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:06:57.844 08:24:50 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:57.844 08:24:50 -- common/autotest_common.sh@10 -- # set +x 00:06:57.844 8 00:06:57.844 08:24:50 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:57.844 08:24:50 -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:06:57.844 08:24:50 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:57.844 08:24:50 -- common/autotest_common.sh@10 -- # set +x 00:06:57.844 9 00:06:57.844 
08:24:50 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:57.844 08:24:50 -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:06:57.844 08:24:50 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:57.844 08:24:50 -- common/autotest_common.sh@10 -- # set +x 00:06:57.844 10 00:06:57.844 08:24:50 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:57.844 08:24:50 -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:06:57.844 08:24:50 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:57.844 08:24:50 -- common/autotest_common.sh@10 -- # set +x 00:06:57.844 08:24:50 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:57.844 08:24:50 -- scheduler/scheduler.sh@22 -- # thread_id=11 00:06:57.844 08:24:50 -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:06:57.844 08:24:50 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:57.844 08:24:50 -- common/autotest_common.sh@10 -- # set +x 00:06:58.779 08:24:51 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:58.779 08:24:51 -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:06:58.779 08:24:51 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:58.779 08:24:51 -- common/autotest_common.sh@10 -- # set +x 00:07:00.237 08:24:52 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:00.237 08:24:52 -- scheduler/scheduler.sh@25 -- # thread_id=12 00:07:00.237 08:24:52 -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:07:00.237 08:24:52 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:00.237 08:24:52 -- common/autotest_common.sh@10 -- # set +x 00:07:01.172 08:24:53 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:01.172 00:07:01.172 real 0m3.382s 00:07:01.172 user 0m0.022s 00:07:01.172 sys 0m0.006s 00:07:01.172 08:24:53 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:01.172 08:24:53 -- common/autotest_common.sh@10 -- # set +x 00:07:01.172 ************************************ 00:07:01.172 END TEST scheduler_create_thread 00:07:01.172 ************************************ 00:07:01.172 08:24:53 -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:07:01.172 08:24:53 -- scheduler/scheduler.sh@46 -- # killprocess 993072 00:07:01.172 08:24:53 -- common/autotest_common.sh@926 -- # '[' -z 993072 ']' 00:07:01.172 08:24:53 -- common/autotest_common.sh@930 -- # kill -0 993072 00:07:01.172 08:24:53 -- common/autotest_common.sh@931 -- # uname 00:07:01.172 08:24:53 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:01.172 08:24:53 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 993072 00:07:01.172 08:24:53 -- common/autotest_common.sh@932 -- # process_name=reactor_2 00:07:01.172 08:24:53 -- common/autotest_common.sh@936 -- # '[' reactor_2 = sudo ']' 00:07:01.172 08:24:53 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 993072' 00:07:01.172 killing process with pid 993072 00:07:01.172 08:24:53 -- common/autotest_common.sh@945 -- # kill 993072 00:07:01.172 08:24:53 -- common/autotest_common.sh@950 -- # wait 993072 00:07:01.431 [2024-10-04 08:24:53.986785] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
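scheduler_create_thread above exercises the app's scheduler_plugin RPCs: four pinned active threads and four pinned idle threads are created, a one-third-active and a half-active thread follow, the half-active one is flipped to 50% activity (thread id 11 in this run), and a throwaway thread (id 12) is created and deleted before the app is killed. A condensed sketch, assuming rpc.py can import the test's scheduler_plugin (the harness arranges that) and that the thread ids match this particular run:

    # Pinned threads: one 100%-active and one idle per core mask
    $RPC --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100
    $RPC --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0
    # Unpinned half-active thread; its returned id (11 here) is set to 50% activity
    tid=$($RPC --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0)
    $RPC --plugin scheduler_plugin scheduler_thread_set_active "$tid" 50
    # Create and delete a thread (id 12 here) to cover removal
    tid=$($RPC --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100)
    $RPC --plugin scheduler_plugin scheduler_thread_delete "$tid"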
00:07:01.431 POWER: Power management governor of lcore 0 has been set to 'powersave' successfully 00:07:01.431 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original 00:07:01.431 POWER: Power management governor of lcore 1 has been set to 'powersave' successfully 00:07:01.431 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original 00:07:01.431 POWER: Power management governor of lcore 2 has been set to 'powersave' successfully 00:07:01.431 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original 00:07:01.431 POWER: Power management governor of lcore 3 has been set to 'powersave' successfully 00:07:01.431 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original 00:07:01.701 00:07:01.701 real 0m4.398s 00:07:01.701 user 0m7.839s 00:07:01.701 sys 0m0.372s 00:07:01.701 08:24:54 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:01.701 08:24:54 -- common/autotest_common.sh@10 -- # set +x 00:07:01.701 ************************************ 00:07:01.701 END TEST event_scheduler 00:07:01.701 ************************************ 00:07:01.701 08:24:54 -- event/event.sh@51 -- # modprobe -n nbd 00:07:01.701 08:24:54 -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:07:01.701 08:24:54 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:01.701 08:24:54 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:01.701 08:24:54 -- common/autotest_common.sh@10 -- # set +x 00:07:01.701 ************************************ 00:07:01.701 START TEST app_repeat 00:07:01.701 ************************************ 00:07:01.701 08:24:54 -- common/autotest_common.sh@1104 -- # app_repeat_test 00:07:01.701 08:24:54 -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:01.701 08:24:54 -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:01.701 08:24:54 -- event/event.sh@13 -- # local nbd_list 00:07:01.701 08:24:54 -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:01.701 08:24:54 -- event/event.sh@14 -- # local bdev_list 00:07:01.701 08:24:54 -- event/event.sh@15 -- # local repeat_times=4 00:07:01.701 08:24:54 -- event/event.sh@17 -- # modprobe nbd 00:07:01.701 08:24:54 -- event/event.sh@19 -- # repeat_pid=993929 00:07:01.701 08:24:54 -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:07:01.701 08:24:54 -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:07:01.701 08:24:54 -- event/event.sh@21 -- # echo 'Process app_repeat pid: 993929' 00:07:01.701 Process app_repeat pid: 993929 00:07:01.701 08:24:54 -- event/event.sh@23 -- # for i in {0..2} 00:07:01.701 08:24:54 -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:07:01.701 spdk_app_start Round 0 00:07:01.701 08:24:54 -- event/event.sh@25 -- # waitforlisten 993929 /var/tmp/spdk-nbd.sock 00:07:01.701 08:24:54 -- common/autotest_common.sh@819 -- # '[' -z 993929 ']' 00:07:01.701 08:24:54 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:01.701 08:24:54 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:01.701 08:24:54 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
00:07:01.701 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:01.702 08:24:54 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:01.702 08:24:54 -- common/autotest_common.sh@10 -- # set +x 00:07:01.702 [2024-10-04 08:24:54.287135] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:07:01.702 [2024-10-04 08:24:54.287234] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid993929 ] 00:07:01.702 EAL: No free 2048 kB hugepages reported on node 1 00:07:01.702 [2024-10-04 08:24:54.353623] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:01.969 [2024-10-04 08:24:54.393207] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:01.969 [2024-10-04 08:24:54.393221] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:02.534 08:24:55 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:02.534 08:24:55 -- common/autotest_common.sh@852 -- # return 0 00:07:02.534 08:24:55 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:02.792 Malloc0 00:07:02.792 08:24:55 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:03.051 Malloc1 00:07:03.051 08:24:55 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:03.051 08:24:55 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:03.051 08:24:55 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:03.051 08:24:55 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:03.051 08:24:55 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:03.051 08:24:55 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:03.051 08:24:55 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:03.051 08:24:55 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:03.051 08:24:55 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:03.051 08:24:55 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:03.051 08:24:55 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:03.051 08:24:55 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:03.051 08:24:55 -- bdev/nbd_common.sh@12 -- # local i 00:07:03.051 08:24:55 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:03.051 08:24:55 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:03.051 08:24:55 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:03.051 /dev/nbd0 00:07:03.051 08:24:55 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:03.051 08:24:55 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:03.051 08:24:55 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:07:03.051 08:24:55 -- common/autotest_common.sh@857 -- # local i 00:07:03.051 08:24:55 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:07:03.051 08:24:55 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:07:03.051 08:24:55 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:07:03.051 08:24:55 -- 
common/autotest_common.sh@861 -- # break 00:07:03.051 08:24:55 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:07:03.051 08:24:55 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:07:03.051 08:24:55 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:03.051 1+0 records in 00:07:03.051 1+0 records out 00:07:03.051 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000232641 s, 17.6 MB/s 00:07:03.051 08:24:55 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:03.309 08:24:55 -- common/autotest_common.sh@874 -- # size=4096 00:07:03.309 08:24:55 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:03.309 08:24:55 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:07:03.309 08:24:55 -- common/autotest_common.sh@877 -- # return 0 00:07:03.309 08:24:55 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:03.309 08:24:55 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:03.309 08:24:55 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:03.309 /dev/nbd1 00:07:03.309 08:24:55 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:03.309 08:24:55 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:03.309 08:24:55 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:07:03.309 08:24:55 -- common/autotest_common.sh@857 -- # local i 00:07:03.309 08:24:55 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:07:03.309 08:24:55 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:07:03.310 08:24:55 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:07:03.310 08:24:55 -- common/autotest_common.sh@861 -- # break 00:07:03.310 08:24:55 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:07:03.310 08:24:55 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:07:03.310 08:24:55 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:03.310 1+0 records in 00:07:03.310 1+0 records out 00:07:03.310 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000242578 s, 16.9 MB/s 00:07:03.310 08:24:55 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:03.310 08:24:55 -- common/autotest_common.sh@874 -- # size=4096 00:07:03.310 08:24:55 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:03.310 08:24:55 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:07:03.310 08:24:55 -- common/autotest_common.sh@877 -- # return 0 00:07:03.310 08:24:55 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:03.310 08:24:55 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:03.310 08:24:55 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:03.310 08:24:55 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:03.310 08:24:55 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:03.568 08:24:56 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:03.568 { 00:07:03.568 "nbd_device": "/dev/nbd0", 00:07:03.568 "bdev_name": "Malloc0" 00:07:03.568 }, 00:07:03.568 { 00:07:03.568 "nbd_device": 
"/dev/nbd1", 00:07:03.568 "bdev_name": "Malloc1" 00:07:03.568 } 00:07:03.568 ]' 00:07:03.568 08:24:56 -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:03.568 { 00:07:03.568 "nbd_device": "/dev/nbd0", 00:07:03.568 "bdev_name": "Malloc0" 00:07:03.568 }, 00:07:03.568 { 00:07:03.568 "nbd_device": "/dev/nbd1", 00:07:03.568 "bdev_name": "Malloc1" 00:07:03.568 } 00:07:03.568 ]' 00:07:03.568 08:24:56 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:03.568 08:24:56 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:03.568 /dev/nbd1' 00:07:03.568 08:24:56 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:03.568 /dev/nbd1' 00:07:03.568 08:24:56 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:03.568 08:24:56 -- bdev/nbd_common.sh@65 -- # count=2 00:07:03.568 08:24:56 -- bdev/nbd_common.sh@66 -- # echo 2 00:07:03.568 08:24:56 -- bdev/nbd_common.sh@95 -- # count=2 00:07:03.568 08:24:56 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:03.568 08:24:56 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:03.568 08:24:56 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:03.568 08:24:56 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:03.568 08:24:56 -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:03.568 08:24:56 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:03.568 08:24:56 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:03.568 08:24:56 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:03.568 256+0 records in 00:07:03.568 256+0 records out 00:07:03.568 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0103402 s, 101 MB/s 00:07:03.568 08:24:56 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:03.568 08:24:56 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:03.568 256+0 records in 00:07:03.568 256+0 records out 00:07:03.568 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0195638 s, 53.6 MB/s 00:07:03.568 08:24:56 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:03.568 08:24:56 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:03.568 256+0 records in 00:07:03.568 256+0 records out 00:07:03.568 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0205602 s, 51.0 MB/s 00:07:03.568 08:24:56 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:03.568 08:24:56 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:03.568 08:24:56 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:03.568 08:24:56 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:03.568 08:24:56 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:03.568 08:24:56 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:03.568 08:24:56 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:03.826 08:24:56 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:03.826 08:24:56 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:03.826 08:24:56 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:03.826 08:24:56 -- 
bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:03.826 08:24:56 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:03.826 08:24:56 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:03.826 08:24:56 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:03.826 08:24:56 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:03.826 08:24:56 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:03.826 08:24:56 -- bdev/nbd_common.sh@51 -- # local i 00:07:03.826 08:24:56 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:03.826 08:24:56 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:03.826 08:24:56 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:03.826 08:24:56 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:03.826 08:24:56 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:03.826 08:24:56 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:03.826 08:24:56 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:03.826 08:24:56 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:03.826 08:24:56 -- bdev/nbd_common.sh@41 -- # break 00:07:03.826 08:24:56 -- bdev/nbd_common.sh@45 -- # return 0 00:07:03.826 08:24:56 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:03.826 08:24:56 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:04.084 08:24:56 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:04.084 08:24:56 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:04.084 08:24:56 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:04.084 08:24:56 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:04.084 08:24:56 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:04.084 08:24:56 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:04.084 08:24:56 -- bdev/nbd_common.sh@41 -- # break 00:07:04.084 08:24:56 -- bdev/nbd_common.sh@45 -- # return 0 00:07:04.084 08:24:56 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:04.084 08:24:56 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:04.084 08:24:56 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:04.342 08:24:56 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:04.342 08:24:56 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:04.342 08:24:56 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:04.342 08:24:56 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:04.342 08:24:56 -- bdev/nbd_common.sh@65 -- # echo '' 00:07:04.342 08:24:56 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:04.342 08:24:56 -- bdev/nbd_common.sh@65 -- # true 00:07:04.342 08:24:56 -- bdev/nbd_common.sh@65 -- # count=0 00:07:04.342 08:24:56 -- bdev/nbd_common.sh@66 -- # echo 0 00:07:04.342 08:24:56 -- bdev/nbd_common.sh@104 -- # count=0 00:07:04.342 08:24:56 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:04.342 08:24:56 -- bdev/nbd_common.sh@109 -- # return 0 00:07:04.342 08:24:56 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 
00:07:04.600 08:24:57 -- event/event.sh@35 -- # sleep 3 00:07:04.600 [2024-10-04 08:24:57.262609] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:04.858 [2024-10-04 08:24:57.294929] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:04.858 [2024-10-04 08:24:57.294930] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:04.858 [2024-10-04 08:24:57.334559] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:04.858 [2024-10-04 08:24:57.334605] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:08.139 08:25:00 -- event/event.sh@23 -- # for i in {0..2} 00:07:08.139 08:25:00 -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:07:08.139 spdk_app_start Round 1 00:07:08.139 08:25:00 -- event/event.sh@25 -- # waitforlisten 993929 /var/tmp/spdk-nbd.sock 00:07:08.139 08:25:00 -- common/autotest_common.sh@819 -- # '[' -z 993929 ']' 00:07:08.139 08:25:00 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:08.139 08:25:00 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:08.139 08:25:00 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:08.139 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:08.139 08:25:00 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:08.139 08:25:00 -- common/autotest_common.sh@10 -- # set +x 00:07:08.139 08:25:00 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:08.139 08:25:00 -- common/autotest_common.sh@852 -- # return 0 00:07:08.139 08:25:00 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:08.139 Malloc0 00:07:08.139 08:25:00 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:08.139 Malloc1 00:07:08.139 08:25:00 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:08.139 08:25:00 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:08.139 08:25:00 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:08.139 08:25:00 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:08.139 08:25:00 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:08.139 08:25:00 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:08.139 08:25:00 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:08.139 08:25:00 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:08.139 08:25:00 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:08.139 08:25:00 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:08.139 08:25:00 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:08.139 08:25:00 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:08.139 08:25:00 -- bdev/nbd_common.sh@12 -- # local i 00:07:08.139 08:25:00 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:08.139 08:25:00 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:08.139 08:25:00 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:08.398 
/dev/nbd0 00:07:08.398 08:25:00 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:08.398 08:25:00 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:08.398 08:25:00 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:07:08.398 08:25:00 -- common/autotest_common.sh@857 -- # local i 00:07:08.398 08:25:00 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:07:08.398 08:25:00 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:07:08.398 08:25:00 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:07:08.398 08:25:00 -- common/autotest_common.sh@861 -- # break 00:07:08.398 08:25:00 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:07:08.398 08:25:00 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:07:08.398 08:25:00 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:08.398 1+0 records in 00:07:08.398 1+0 records out 00:07:08.398 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000220131 s, 18.6 MB/s 00:07:08.398 08:25:00 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:08.398 08:25:00 -- common/autotest_common.sh@874 -- # size=4096 00:07:08.398 08:25:00 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:08.398 08:25:00 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:07:08.398 08:25:00 -- common/autotest_common.sh@877 -- # return 0 00:07:08.398 08:25:00 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:08.398 08:25:00 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:08.399 08:25:00 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:08.399 /dev/nbd1 00:07:08.399 08:25:01 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:08.657 08:25:01 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:08.657 08:25:01 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:07:08.657 08:25:01 -- common/autotest_common.sh@857 -- # local i 00:07:08.657 08:25:01 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:07:08.657 08:25:01 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:07:08.657 08:25:01 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:07:08.657 08:25:01 -- common/autotest_common.sh@861 -- # break 00:07:08.657 08:25:01 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:07:08.657 08:25:01 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:07:08.657 08:25:01 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:08.657 1+0 records in 00:07:08.657 1+0 records out 00:07:08.657 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000235625 s, 17.4 MB/s 00:07:08.657 08:25:01 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:08.657 08:25:01 -- common/autotest_common.sh@874 -- # size=4096 00:07:08.657 08:25:01 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:08.657 08:25:01 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:07:08.657 08:25:01 -- common/autotest_common.sh@877 -- # return 0 00:07:08.657 08:25:01 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:08.657 08:25:01 -- bdev/nbd_common.sh@14 -- # (( i < 2 
)) 00:07:08.657 08:25:01 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:08.657 08:25:01 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:08.657 08:25:01 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:08.657 08:25:01 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:08.657 { 00:07:08.657 "nbd_device": "/dev/nbd0", 00:07:08.657 "bdev_name": "Malloc0" 00:07:08.657 }, 00:07:08.657 { 00:07:08.657 "nbd_device": "/dev/nbd1", 00:07:08.657 "bdev_name": "Malloc1" 00:07:08.657 } 00:07:08.657 ]' 00:07:08.657 08:25:01 -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:08.657 { 00:07:08.657 "nbd_device": "/dev/nbd0", 00:07:08.657 "bdev_name": "Malloc0" 00:07:08.657 }, 00:07:08.657 { 00:07:08.657 "nbd_device": "/dev/nbd1", 00:07:08.657 "bdev_name": "Malloc1" 00:07:08.657 } 00:07:08.657 ]' 00:07:08.657 08:25:01 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:08.657 08:25:01 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:08.657 /dev/nbd1' 00:07:08.657 08:25:01 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:08.657 /dev/nbd1' 00:07:08.657 08:25:01 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:08.657 08:25:01 -- bdev/nbd_common.sh@65 -- # count=2 00:07:08.657 08:25:01 -- bdev/nbd_common.sh@66 -- # echo 2 00:07:08.657 08:25:01 -- bdev/nbd_common.sh@95 -- # count=2 00:07:08.657 08:25:01 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:08.657 08:25:01 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:08.657 08:25:01 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:08.657 08:25:01 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:08.657 08:25:01 -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:08.657 08:25:01 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:08.657 08:25:01 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:08.657 08:25:01 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:08.916 256+0 records in 00:07:08.916 256+0 records out 00:07:08.916 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0104855 s, 100 MB/s 00:07:08.916 08:25:01 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:08.916 08:25:01 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:08.916 256+0 records in 00:07:08.916 256+0 records out 00:07:08.916 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0196357 s, 53.4 MB/s 00:07:08.916 08:25:01 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:08.916 08:25:01 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:08.916 256+0 records in 00:07:08.916 256+0 records out 00:07:08.916 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0209652 s, 50.0 MB/s 00:07:08.916 08:25:01 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:08.916 08:25:01 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:08.916 08:25:01 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:08.916 08:25:01 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:08.916 08:25:01 -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:08.916 08:25:01 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:08.916 08:25:01 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:08.916 08:25:01 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:08.916 08:25:01 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:08.916 08:25:01 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:08.916 08:25:01 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:08.916 08:25:01 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:08.916 08:25:01 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:08.916 08:25:01 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:08.916 08:25:01 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:08.916 08:25:01 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:08.916 08:25:01 -- bdev/nbd_common.sh@51 -- # local i 00:07:08.916 08:25:01 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:08.916 08:25:01 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:09.176 08:25:01 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:09.176 08:25:01 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:09.176 08:25:01 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:09.176 08:25:01 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:09.176 08:25:01 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:09.176 08:25:01 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:09.176 08:25:01 -- bdev/nbd_common.sh@41 -- # break 00:07:09.176 08:25:01 -- bdev/nbd_common.sh@45 -- # return 0 00:07:09.176 08:25:01 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:09.176 08:25:01 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:09.176 08:25:01 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:09.176 08:25:01 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:09.176 08:25:01 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:09.176 08:25:01 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:09.176 08:25:01 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:09.176 08:25:01 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:09.176 08:25:01 -- bdev/nbd_common.sh@41 -- # break 00:07:09.176 08:25:01 -- bdev/nbd_common.sh@45 -- # return 0 00:07:09.176 08:25:01 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:09.176 08:25:01 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:09.176 08:25:01 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:09.435 08:25:02 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:09.435 08:25:02 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:09.435 08:25:02 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:09.435 08:25:02 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:09.435 08:25:02 -- bdev/nbd_common.sh@65 -- # echo '' 00:07:09.435 08:25:02 -- 
bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:09.435 08:25:02 -- bdev/nbd_common.sh@65 -- # true 00:07:09.435 08:25:02 -- bdev/nbd_common.sh@65 -- # count=0 00:07:09.435 08:25:02 -- bdev/nbd_common.sh@66 -- # echo 0 00:07:09.435 08:25:02 -- bdev/nbd_common.sh@104 -- # count=0 00:07:09.435 08:25:02 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:09.435 08:25:02 -- bdev/nbd_common.sh@109 -- # return 0 00:07:09.435 08:25:02 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:09.694 08:25:02 -- event/event.sh@35 -- # sleep 3 00:07:09.953 [2024-10-04 08:25:02.429716] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:09.953 [2024-10-04 08:25:02.461668] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:09.953 [2024-10-04 08:25:02.461671] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:09.953 [2024-10-04 08:25:02.501283] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:09.953 [2024-10-04 08:25:02.501326] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:13.241 08:25:05 -- event/event.sh@23 -- # for i in {0..2} 00:07:13.241 08:25:05 -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:07:13.241 spdk_app_start Round 2 00:07:13.241 08:25:05 -- event/event.sh@25 -- # waitforlisten 993929 /var/tmp/spdk-nbd.sock 00:07:13.241 08:25:05 -- common/autotest_common.sh@819 -- # '[' -z 993929 ']' 00:07:13.241 08:25:05 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:13.241 08:25:05 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:13.242 08:25:05 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:13.242 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:07:13.242 08:25:05 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:13.242 08:25:05 -- common/autotest_common.sh@10 -- # set +x 00:07:13.242 08:25:05 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:13.242 08:25:05 -- common/autotest_common.sh@852 -- # return 0 00:07:13.242 08:25:05 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:13.242 Malloc0 00:07:13.242 08:25:05 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:13.242 Malloc1 00:07:13.242 08:25:05 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:13.242 08:25:05 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:13.242 08:25:05 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:13.242 08:25:05 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:13.242 08:25:05 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:13.242 08:25:05 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:13.242 08:25:05 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:13.242 08:25:05 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:13.242 08:25:05 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:13.242 08:25:05 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:13.242 08:25:05 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:13.242 08:25:05 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:13.242 08:25:05 -- bdev/nbd_common.sh@12 -- # local i 00:07:13.242 08:25:05 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:13.242 08:25:05 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:13.242 08:25:05 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:13.501 /dev/nbd0 00:07:13.501 08:25:05 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:13.501 08:25:05 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:13.501 08:25:05 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:07:13.501 08:25:05 -- common/autotest_common.sh@857 -- # local i 00:07:13.501 08:25:05 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:07:13.501 08:25:05 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:07:13.501 08:25:05 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:07:13.501 08:25:05 -- common/autotest_common.sh@861 -- # break 00:07:13.501 08:25:05 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:07:13.501 08:25:05 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:07:13.501 08:25:05 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:13.501 1+0 records in 00:07:13.501 1+0 records out 00:07:13.501 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000249483 s, 16.4 MB/s 00:07:13.501 08:25:06 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:13.501 08:25:06 -- common/autotest_common.sh@874 -- # size=4096 00:07:13.501 08:25:06 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:13.501 08:25:06 -- 
common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:07:13.501 08:25:06 -- common/autotest_common.sh@877 -- # return 0 00:07:13.501 08:25:06 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:13.501 08:25:06 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:13.501 08:25:06 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:13.759 /dev/nbd1 00:07:13.759 08:25:06 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:13.759 08:25:06 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:13.759 08:25:06 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:07:13.759 08:25:06 -- common/autotest_common.sh@857 -- # local i 00:07:13.759 08:25:06 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:07:13.759 08:25:06 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:07:13.759 08:25:06 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:07:13.759 08:25:06 -- common/autotest_common.sh@861 -- # break 00:07:13.759 08:25:06 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:07:13.759 08:25:06 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:07:13.759 08:25:06 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:13.759 1+0 records in 00:07:13.759 1+0 records out 00:07:13.759 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000258308 s, 15.9 MB/s 00:07:13.759 08:25:06 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:13.759 08:25:06 -- common/autotest_common.sh@874 -- # size=4096 00:07:13.759 08:25:06 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:13.760 08:25:06 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:07:13.760 08:25:06 -- common/autotest_common.sh@877 -- # return 0 00:07:13.760 08:25:06 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:13.760 08:25:06 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:13.760 08:25:06 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:13.760 08:25:06 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:13.760 08:25:06 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:13.760 08:25:06 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:13.760 { 00:07:13.760 "nbd_device": "/dev/nbd0", 00:07:13.760 "bdev_name": "Malloc0" 00:07:13.760 }, 00:07:13.760 { 00:07:13.760 "nbd_device": "/dev/nbd1", 00:07:13.760 "bdev_name": "Malloc1" 00:07:13.760 } 00:07:13.760 ]' 00:07:13.760 08:25:06 -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:13.760 { 00:07:13.760 "nbd_device": "/dev/nbd0", 00:07:13.760 "bdev_name": "Malloc0" 00:07:13.760 }, 00:07:13.760 { 00:07:13.760 "nbd_device": "/dev/nbd1", 00:07:13.760 "bdev_name": "Malloc1" 00:07:13.760 } 00:07:13.760 ]' 00:07:13.760 08:25:06 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:14.018 08:25:06 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:14.018 /dev/nbd1' 00:07:14.018 08:25:06 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:14.018 /dev/nbd1' 00:07:14.018 08:25:06 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:14.018 08:25:06 -- bdev/nbd_common.sh@65 -- # count=2 00:07:14.018 08:25:06 -- bdev/nbd_common.sh@66 -- # echo 2 00:07:14.018 08:25:06 -- 
bdev/nbd_common.sh@95 -- # count=2 00:07:14.018 08:25:06 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:14.018 08:25:06 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:14.018 08:25:06 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:14.018 08:25:06 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:14.019 08:25:06 -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:14.019 08:25:06 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:14.019 08:25:06 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:14.019 08:25:06 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:14.019 256+0 records in 00:07:14.019 256+0 records out 00:07:14.019 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0104619 s, 100 MB/s 00:07:14.019 08:25:06 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:14.019 08:25:06 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:14.019 256+0 records in 00:07:14.019 256+0 records out 00:07:14.019 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0196973 s, 53.2 MB/s 00:07:14.019 08:25:06 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:14.019 08:25:06 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:14.019 256+0 records in 00:07:14.019 256+0 records out 00:07:14.019 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0209429 s, 50.1 MB/s 00:07:14.019 08:25:06 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:14.019 08:25:06 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:14.019 08:25:06 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:14.019 08:25:06 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:14.019 08:25:06 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:14.019 08:25:06 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:14.019 08:25:06 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:14.019 08:25:06 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:14.019 08:25:06 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:14.019 08:25:06 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:14.019 08:25:06 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:14.019 08:25:06 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:14.019 08:25:06 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:14.019 08:25:06 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:14.019 08:25:06 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:14.019 08:25:06 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:14.019 08:25:06 -- bdev/nbd_common.sh@51 -- # local i 00:07:14.019 08:25:06 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:14.019 08:25:06 -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:14.278 08:25:06 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:14.278 08:25:06 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:14.278 08:25:06 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:14.278 08:25:06 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:14.278 08:25:06 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:14.278 08:25:06 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:14.278 08:25:06 -- bdev/nbd_common.sh@41 -- # break 00:07:14.278 08:25:06 -- bdev/nbd_common.sh@45 -- # return 0 00:07:14.278 08:25:06 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:14.278 08:25:06 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:14.278 08:25:06 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:14.278 08:25:06 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:14.278 08:25:06 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:14.278 08:25:06 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:14.278 08:25:06 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:14.278 08:25:06 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:14.278 08:25:06 -- bdev/nbd_common.sh@41 -- # break 00:07:14.278 08:25:06 -- bdev/nbd_common.sh@45 -- # return 0 00:07:14.278 08:25:06 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:14.278 08:25:06 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:14.278 08:25:06 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:14.537 08:25:07 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:14.537 08:25:07 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:14.537 08:25:07 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:14.537 08:25:07 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:14.537 08:25:07 -- bdev/nbd_common.sh@65 -- # echo '' 00:07:14.537 08:25:07 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:14.537 08:25:07 -- bdev/nbd_common.sh@65 -- # true 00:07:14.537 08:25:07 -- bdev/nbd_common.sh@65 -- # count=0 00:07:14.537 08:25:07 -- bdev/nbd_common.sh@66 -- # echo 0 00:07:14.537 08:25:07 -- bdev/nbd_common.sh@104 -- # count=0 00:07:14.537 08:25:07 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:14.537 08:25:07 -- bdev/nbd_common.sh@109 -- # return 0 00:07:14.537 08:25:07 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:14.794 08:25:07 -- event/event.sh@35 -- # sleep 3 00:07:15.053 [2024-10-04 08:25:07.533583] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:15.053 [2024-10-04 08:25:07.565588] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:15.053 [2024-10-04 08:25:07.565590] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:15.053 [2024-10-04 08:25:07.605318] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:15.053 [2024-10-04 08:25:07.605362] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
00:07:18.339 08:25:10 -- event/event.sh@38 -- # waitforlisten 993929 /var/tmp/spdk-nbd.sock 00:07:18.339 08:25:10 -- common/autotest_common.sh@819 -- # '[' -z 993929 ']' 00:07:18.339 08:25:10 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:18.339 08:25:10 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:18.339 08:25:10 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:18.339 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:18.339 08:25:10 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:18.339 08:25:10 -- common/autotest_common.sh@10 -- # set +x 00:07:18.339 08:25:10 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:18.339 08:25:10 -- common/autotest_common.sh@852 -- # return 0 00:07:18.339 08:25:10 -- event/event.sh@39 -- # killprocess 993929 00:07:18.339 08:25:10 -- common/autotest_common.sh@926 -- # '[' -z 993929 ']' 00:07:18.339 08:25:10 -- common/autotest_common.sh@930 -- # kill -0 993929 00:07:18.339 08:25:10 -- common/autotest_common.sh@931 -- # uname 00:07:18.339 08:25:10 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:18.339 08:25:10 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 993929 00:07:18.339 08:25:10 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:18.339 08:25:10 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:18.339 08:25:10 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 993929' 00:07:18.339 killing process with pid 993929 00:07:18.339 08:25:10 -- common/autotest_common.sh@945 -- # kill 993929 00:07:18.339 08:25:10 -- common/autotest_common.sh@950 -- # wait 993929 00:07:18.339 spdk_app_start is called in Round 0. 00:07:18.339 Shutdown signal received, stop current app iteration 00:07:18.339 Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 reinitialization... 00:07:18.339 spdk_app_start is called in Round 1. 00:07:18.339 Shutdown signal received, stop current app iteration 00:07:18.339 Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 reinitialization... 00:07:18.339 spdk_app_start is called in Round 2. 00:07:18.339 Shutdown signal received, stop current app iteration 00:07:18.339 Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 reinitialization... 00:07:18.339 spdk_app_start is called in Round 3. 
00:07:18.339 Shutdown signal received, stop current app iteration 00:07:18.339 08:25:10 -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:07:18.339 08:25:10 -- event/event.sh@42 -- # return 0 00:07:18.339 00:07:18.339 real 0m16.499s 00:07:18.339 user 0m35.477s 00:07:18.339 sys 0m3.002s 00:07:18.340 08:25:10 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:18.340 08:25:10 -- common/autotest_common.sh@10 -- # set +x 00:07:18.340 ************************************ 00:07:18.340 END TEST app_repeat 00:07:18.340 ************************************ 00:07:18.340 08:25:10 -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:07:18.340 08:25:10 -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:07:18.340 08:25:10 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:18.340 08:25:10 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:18.340 08:25:10 -- common/autotest_common.sh@10 -- # set +x 00:07:18.340 ************************************ 00:07:18.340 START TEST cpu_locks 00:07:18.340 ************************************ 00:07:18.340 08:25:10 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:07:18.340 * Looking for test storage... 00:07:18.340 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:07:18.340 08:25:10 -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:07:18.340 08:25:10 -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:07:18.340 08:25:10 -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:07:18.340 08:25:10 -- event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:07:18.340 08:25:10 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:18.340 08:25:10 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:18.340 08:25:10 -- common/autotest_common.sh@10 -- # set +x 00:07:18.340 ************************************ 00:07:18.340 START TEST default_locks 00:07:18.340 ************************************ 00:07:18.340 08:25:10 -- common/autotest_common.sh@1104 -- # default_locks 00:07:18.340 08:25:10 -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=997005 00:07:18.340 08:25:10 -- event/cpu_locks.sh@47 -- # waitforlisten 997005 00:07:18.340 08:25:10 -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:18.340 08:25:10 -- common/autotest_common.sh@819 -- # '[' -z 997005 ']' 00:07:18.340 08:25:10 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:18.340 08:25:10 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:18.340 08:25:10 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:18.340 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:18.340 08:25:10 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:18.340 08:25:10 -- common/autotest_common.sh@10 -- # set +x 00:07:18.340 [2024-10-04 08:25:10.936435] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
00:07:18.340 [2024-10-04 08:25:10.936524] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid997005 ] 00:07:18.340 EAL: No free 2048 kB hugepages reported on node 1 00:07:18.340 [2024-10-04 08:25:11.006633] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:18.599 [2024-10-04 08:25:11.044719] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:18.599 [2024-10-04 08:25:11.044831] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:19.166 08:25:11 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:19.166 08:25:11 -- common/autotest_common.sh@852 -- # return 0 00:07:19.166 08:25:11 -- event/cpu_locks.sh@49 -- # locks_exist 997005 00:07:19.166 08:25:11 -- event/cpu_locks.sh@22 -- # lslocks -p 997005 00:07:19.166 08:25:11 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:19.734 lslocks: write error 00:07:19.734 08:25:12 -- event/cpu_locks.sh@50 -- # killprocess 997005 00:07:19.734 08:25:12 -- common/autotest_common.sh@926 -- # '[' -z 997005 ']' 00:07:19.734 08:25:12 -- common/autotest_common.sh@930 -- # kill -0 997005 00:07:19.734 08:25:12 -- common/autotest_common.sh@931 -- # uname 00:07:19.734 08:25:12 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:19.734 08:25:12 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 997005 00:07:19.734 08:25:12 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:19.734 08:25:12 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:19.734 08:25:12 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 997005' 00:07:19.734 killing process with pid 997005 00:07:19.734 08:25:12 -- common/autotest_common.sh@945 -- # kill 997005 00:07:19.734 08:25:12 -- common/autotest_common.sh@950 -- # wait 997005 00:07:19.993 08:25:12 -- event/cpu_locks.sh@52 -- # NOT waitforlisten 997005 00:07:19.993 08:25:12 -- common/autotest_common.sh@640 -- # local es=0 00:07:19.993 08:25:12 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 997005 00:07:19.993 08:25:12 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:07:19.993 08:25:12 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:19.993 08:25:12 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:07:19.993 08:25:12 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:19.993 08:25:12 -- common/autotest_common.sh@643 -- # waitforlisten 997005 00:07:19.993 08:25:12 -- common/autotest_common.sh@819 -- # '[' -z 997005 ']' 00:07:19.993 08:25:12 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:19.993 08:25:12 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:19.993 08:25:12 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:19.993 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:07:19.993 08:25:12 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:19.993 08:25:12 -- common/autotest_common.sh@10 -- # set +x 00:07:19.993 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (997005) - No such process 00:07:19.993 ERROR: process (pid: 997005) is no longer running 00:07:19.993 08:25:12 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:19.993 08:25:12 -- common/autotest_common.sh@852 -- # return 1 00:07:19.993 08:25:12 -- common/autotest_common.sh@643 -- # es=1 00:07:19.993 08:25:12 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:07:19.993 08:25:12 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:07:19.993 08:25:12 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:07:19.993 08:25:12 -- event/cpu_locks.sh@54 -- # no_locks 00:07:19.993 08:25:12 -- event/cpu_locks.sh@26 -- # lock_files=() 00:07:19.993 08:25:12 -- event/cpu_locks.sh@26 -- # local lock_files 00:07:19.993 08:25:12 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:07:19.993 00:07:19.993 real 0m1.577s 00:07:19.993 user 0m1.653s 00:07:19.993 sys 0m0.549s 00:07:19.993 08:25:12 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:19.993 08:25:12 -- common/autotest_common.sh@10 -- # set +x 00:07:19.993 ************************************ 00:07:19.993 END TEST default_locks 00:07:19.993 ************************************ 00:07:19.993 08:25:12 -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:07:19.993 08:25:12 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:19.993 08:25:12 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:19.993 08:25:12 -- common/autotest_common.sh@10 -- # set +x 00:07:19.993 ************************************ 00:07:19.993 START TEST default_locks_via_rpc 00:07:19.993 ************************************ 00:07:19.993 08:25:12 -- common/autotest_common.sh@1104 -- # default_locks_via_rpc 00:07:19.993 08:25:12 -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=997338 00:07:19.993 08:25:12 -- event/cpu_locks.sh@63 -- # waitforlisten 997338 00:07:19.993 08:25:12 -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:19.993 08:25:12 -- common/autotest_common.sh@819 -- # '[' -z 997338 ']' 00:07:19.993 08:25:12 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:19.993 08:25:12 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:19.993 08:25:12 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:19.993 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:19.993 08:25:12 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:19.993 08:25:12 -- common/autotest_common.sh@10 -- # set +x 00:07:19.993 [2024-10-04 08:25:12.557675] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
00:07:19.993 [2024-10-04 08:25:12.557765] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid997338 ] 00:07:19.993 EAL: No free 2048 kB hugepages reported on node 1 00:07:19.993 [2024-10-04 08:25:12.625412] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:19.993 [2024-10-04 08:25:12.662854] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:19.993 [2024-10-04 08:25:12.662963] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:20.931 08:25:13 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:20.931 08:25:13 -- common/autotest_common.sh@852 -- # return 0 00:07:20.931 08:25:13 -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:07:20.931 08:25:13 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:20.931 08:25:13 -- common/autotest_common.sh@10 -- # set +x 00:07:20.931 08:25:13 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:20.931 08:25:13 -- event/cpu_locks.sh@67 -- # no_locks 00:07:20.931 08:25:13 -- event/cpu_locks.sh@26 -- # lock_files=() 00:07:20.931 08:25:13 -- event/cpu_locks.sh@26 -- # local lock_files 00:07:20.931 08:25:13 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:07:20.931 08:25:13 -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:07:20.931 08:25:13 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:20.931 08:25:13 -- common/autotest_common.sh@10 -- # set +x 00:07:20.931 08:25:13 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:20.931 08:25:13 -- event/cpu_locks.sh@71 -- # locks_exist 997338 00:07:20.931 08:25:13 -- event/cpu_locks.sh@22 -- # lslocks -p 997338 00:07:20.931 08:25:13 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:21.191 08:25:13 -- event/cpu_locks.sh@73 -- # killprocess 997338 00:07:21.191 08:25:13 -- common/autotest_common.sh@926 -- # '[' -z 997338 ']' 00:07:21.191 08:25:13 -- common/autotest_common.sh@930 -- # kill -0 997338 00:07:21.191 08:25:13 -- common/autotest_common.sh@931 -- # uname 00:07:21.191 08:25:13 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:21.191 08:25:13 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 997338 00:07:21.191 08:25:13 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:21.191 08:25:13 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:21.191 08:25:13 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 997338' 00:07:21.191 killing process with pid 997338 00:07:21.191 08:25:13 -- common/autotest_common.sh@945 -- # kill 997338 00:07:21.191 08:25:13 -- common/autotest_common.sh@950 -- # wait 997338 00:07:21.451 00:07:21.451 real 0m1.517s 00:07:21.451 user 0m1.601s 00:07:21.451 sys 0m0.528s 00:07:21.451 08:25:14 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:21.451 08:25:14 -- common/autotest_common.sh@10 -- # set +x 00:07:21.451 ************************************ 00:07:21.451 END TEST default_locks_via_rpc 00:07:21.451 ************************************ 00:07:21.451 08:25:14 -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:07:21.451 08:25:14 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:21.451 08:25:14 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:21.451 08:25:14 -- common/autotest_common.sh@10 
-- # set +x 00:07:21.451 ************************************ 00:07:21.451 START TEST non_locking_app_on_locked_coremask 00:07:21.451 ************************************ 00:07:21.451 08:25:14 -- common/autotest_common.sh@1104 -- # non_locking_app_on_locked_coremask 00:07:21.451 08:25:14 -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=997641 00:07:21.451 08:25:14 -- event/cpu_locks.sh@81 -- # waitforlisten 997641 /var/tmp/spdk.sock 00:07:21.451 08:25:14 -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:21.451 08:25:14 -- common/autotest_common.sh@819 -- # '[' -z 997641 ']' 00:07:21.451 08:25:14 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:21.451 08:25:14 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:21.451 08:25:14 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:21.451 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:21.451 08:25:14 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:21.451 08:25:14 -- common/autotest_common.sh@10 -- # set +x 00:07:21.451 [2024-10-04 08:25:14.119305] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:07:21.451 [2024-10-04 08:25:14.119386] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid997641 ] 00:07:21.711 EAL: No free 2048 kB hugepages reported on node 1 00:07:21.711 [2024-10-04 08:25:14.185818] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:21.711 [2024-10-04 08:25:14.222589] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:21.711 [2024-10-04 08:25:14.222699] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:22.280 08:25:14 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:22.280 08:25:14 -- common/autotest_common.sh@852 -- # return 0 00:07:22.280 08:25:14 -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=997741 00:07:22.280 08:25:14 -- event/cpu_locks.sh@85 -- # waitforlisten 997741 /var/tmp/spdk2.sock 00:07:22.280 08:25:14 -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:07:22.280 08:25:14 -- common/autotest_common.sh@819 -- # '[' -z 997741 ']' 00:07:22.280 08:25:14 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:22.280 08:25:14 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:22.280 08:25:14 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:22.280 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:22.280 08:25:14 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:22.280 08:25:14 -- common/autotest_common.sh@10 -- # set +x 00:07:22.540 [2024-10-04 08:25:14.963087] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
00:07:22.540 [2024-10-04 08:25:14.963175] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid997741 ] 00:07:22.540 EAL: No free 2048 kB hugepages reported on node 1 00:07:22.540 [2024-10-04 08:25:15.050996] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:07:22.540 [2024-10-04 08:25:15.051020] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:22.540 [2024-10-04 08:25:15.122999] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:22.540 [2024-10-04 08:25:15.123111] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:23.108 08:25:15 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:23.108 08:25:15 -- common/autotest_common.sh@852 -- # return 0 00:07:23.366 08:25:15 -- event/cpu_locks.sh@87 -- # locks_exist 997641 00:07:23.366 08:25:15 -- event/cpu_locks.sh@22 -- # lslocks -p 997641 00:07:23.366 08:25:15 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:24.303 lslocks: write error 00:07:24.303 08:25:16 -- event/cpu_locks.sh@89 -- # killprocess 997641 00:07:24.303 08:25:16 -- common/autotest_common.sh@926 -- # '[' -z 997641 ']' 00:07:24.303 08:25:16 -- common/autotest_common.sh@930 -- # kill -0 997641 00:07:24.303 08:25:16 -- common/autotest_common.sh@931 -- # uname 00:07:24.303 08:25:16 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:24.303 08:25:16 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 997641 00:07:24.303 08:25:16 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:24.303 08:25:16 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:24.303 08:25:16 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 997641' 00:07:24.303 killing process with pid 997641 00:07:24.303 08:25:16 -- common/autotest_common.sh@945 -- # kill 997641 00:07:24.303 08:25:16 -- common/autotest_common.sh@950 -- # wait 997641 00:07:25.240 08:25:17 -- event/cpu_locks.sh@90 -- # killprocess 997741 00:07:25.240 08:25:17 -- common/autotest_common.sh@926 -- # '[' -z 997741 ']' 00:07:25.240 08:25:17 -- common/autotest_common.sh@930 -- # kill -0 997741 00:07:25.240 08:25:17 -- common/autotest_common.sh@931 -- # uname 00:07:25.240 08:25:17 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:25.240 08:25:17 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 997741 00:07:25.240 08:25:17 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:25.240 08:25:17 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:25.240 08:25:17 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 997741' 00:07:25.240 killing process with pid 997741 00:07:25.240 08:25:17 -- common/autotest_common.sh@945 -- # kill 997741 00:07:25.240 08:25:17 -- common/autotest_common.sh@950 -- # wait 997741 00:07:25.240 00:07:25.240 real 0m3.819s 00:07:25.240 user 0m4.085s 00:07:25.240 sys 0m1.290s 00:07:25.241 08:25:17 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:25.241 08:25:17 -- common/autotest_common.sh@10 -- # set +x 00:07:25.241 ************************************ 00:07:25.241 END TEST non_locking_app_on_locked_coremask 00:07:25.241 ************************************ 00:07:25.499 08:25:17 -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask 
00:07:25.499 08:25:17 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:25.499 08:25:17 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:25.499 08:25:17 -- common/autotest_common.sh@10 -- # set +x 00:07:25.499 ************************************ 00:07:25.499 START TEST locking_app_on_unlocked_coremask 00:07:25.499 ************************************ 00:07:25.499 08:25:17 -- common/autotest_common.sh@1104 -- # locking_app_on_unlocked_coremask 00:07:25.499 08:25:17 -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:07:25.499 08:25:17 -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=998318 00:07:25.499 08:25:17 -- event/cpu_locks.sh@99 -- # waitforlisten 998318 /var/tmp/spdk.sock 00:07:25.499 08:25:17 -- common/autotest_common.sh@819 -- # '[' -z 998318 ']' 00:07:25.499 08:25:17 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:25.499 08:25:17 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:25.499 08:25:17 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:25.499 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:25.499 08:25:17 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:25.499 08:25:17 -- common/autotest_common.sh@10 -- # set +x 00:07:25.499 [2024-10-04 08:25:17.977005] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:07:25.499 [2024-10-04 08:25:17.977056] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid998318 ] 00:07:25.499 EAL: No free 2048 kB hugepages reported on node 1 00:07:25.499 [2024-10-04 08:25:18.039966] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:07:25.499 [2024-10-04 08:25:18.039991] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:25.499 [2024-10-04 08:25:18.077599] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:25.499 [2024-10-04 08:25:18.077707] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:26.433 08:25:18 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:26.433 08:25:18 -- common/autotest_common.sh@852 -- # return 0 00:07:26.433 08:25:18 -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:07:26.433 08:25:18 -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=998540 00:07:26.433 08:25:18 -- event/cpu_locks.sh@103 -- # waitforlisten 998540 /var/tmp/spdk2.sock 00:07:26.433 08:25:18 -- common/autotest_common.sh@819 -- # '[' -z 998540 ']' 00:07:26.433 08:25:18 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:26.433 08:25:18 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:26.433 08:25:18 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:26.433 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
00:07:26.433 08:25:18 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:26.433 08:25:18 -- common/autotest_common.sh@10 -- # set +x 00:07:26.433 [2024-10-04 08:25:18.817859] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:07:26.433 [2024-10-04 08:25:18.817909] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid998540 ] 00:07:26.433 EAL: No free 2048 kB hugepages reported on node 1 00:07:26.433 [2024-10-04 08:25:18.904141] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:26.433 [2024-10-04 08:25:18.981886] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:26.433 [2024-10-04 08:25:18.981995] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:27.000 08:25:19 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:27.000 08:25:19 -- common/autotest_common.sh@852 -- # return 0 00:07:27.000 08:25:19 -- event/cpu_locks.sh@105 -- # locks_exist 998540 00:07:27.000 08:25:19 -- event/cpu_locks.sh@22 -- # lslocks -p 998540 00:07:27.000 08:25:19 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:27.935 lslocks: write error 00:07:27.935 08:25:20 -- event/cpu_locks.sh@107 -- # killprocess 998318 00:07:27.935 08:25:20 -- common/autotest_common.sh@926 -- # '[' -z 998318 ']' 00:07:27.935 08:25:20 -- common/autotest_common.sh@930 -- # kill -0 998318 00:07:27.935 08:25:20 -- common/autotest_common.sh@931 -- # uname 00:07:27.935 08:25:20 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:27.935 08:25:20 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 998318 00:07:27.935 08:25:20 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:27.935 08:25:20 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:27.935 08:25:20 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 998318' 00:07:27.935 killing process with pid 998318 00:07:27.935 08:25:20 -- common/autotest_common.sh@945 -- # kill 998318 00:07:27.935 08:25:20 -- common/autotest_common.sh@950 -- # wait 998318 00:07:28.517 08:25:20 -- event/cpu_locks.sh@108 -- # killprocess 998540 00:07:28.517 08:25:20 -- common/autotest_common.sh@926 -- # '[' -z 998540 ']' 00:07:28.517 08:25:20 -- common/autotest_common.sh@930 -- # kill -0 998540 00:07:28.517 08:25:20 -- common/autotest_common.sh@931 -- # uname 00:07:28.517 08:25:20 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:28.517 08:25:20 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 998540 00:07:28.517 08:25:20 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:28.517 08:25:20 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:28.517 08:25:20 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 998540' 00:07:28.517 killing process with pid 998540 00:07:28.517 08:25:20 -- common/autotest_common.sh@945 -- # kill 998540 00:07:28.517 08:25:20 -- common/autotest_common.sh@950 -- # wait 998540 00:07:28.849 00:07:28.849 real 0m3.288s 00:07:28.849 user 0m3.524s 00:07:28.849 sys 0m1.047s 00:07:28.849 08:25:21 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:28.849 08:25:21 -- common/autotest_common.sh@10 -- # set +x 00:07:28.849 ************************************ 00:07:28.849 END TEST locking_app_on_unlocked_coremask 00:07:28.849 
************************************ 00:07:28.849 08:25:21 -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:07:28.849 08:25:21 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:28.849 08:25:21 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:28.849 08:25:21 -- common/autotest_common.sh@10 -- # set +x 00:07:28.849 ************************************ 00:07:28.849 START TEST locking_app_on_locked_coremask 00:07:28.849 ************************************ 00:07:28.849 08:25:21 -- common/autotest_common.sh@1104 -- # locking_app_on_locked_coremask 00:07:28.849 08:25:21 -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=998898 00:07:28.849 08:25:21 -- event/cpu_locks.sh@116 -- # waitforlisten 998898 /var/tmp/spdk.sock 00:07:28.849 08:25:21 -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:28.849 08:25:21 -- common/autotest_common.sh@819 -- # '[' -z 998898 ']' 00:07:28.849 08:25:21 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:28.849 08:25:21 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:28.849 08:25:21 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:28.849 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:28.849 08:25:21 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:28.849 08:25:21 -- common/autotest_common.sh@10 -- # set +x 00:07:28.849 [2024-10-04 08:25:21.327277] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:07:28.849 [2024-10-04 08:25:21.327368] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid998898 ] 00:07:28.849 EAL: No free 2048 kB hugepages reported on node 1 00:07:28.849 [2024-10-04 08:25:21.394067] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:28.849 [2024-10-04 08:25:21.426288] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:28.849 [2024-10-04 08:25:21.426402] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:29.785 08:25:22 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:29.785 08:25:22 -- common/autotest_common.sh@852 -- # return 0 00:07:29.785 08:25:22 -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=999167 00:07:29.785 08:25:22 -- event/cpu_locks.sh@120 -- # NOT waitforlisten 999167 /var/tmp/spdk2.sock 00:07:29.785 08:25:22 -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:07:29.785 08:25:22 -- common/autotest_common.sh@640 -- # local es=0 00:07:29.785 08:25:22 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 999167 /var/tmp/spdk2.sock 00:07:29.785 08:25:22 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:07:29.785 08:25:22 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:29.785 08:25:22 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:07:29.785 08:25:22 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:29.785 08:25:22 -- common/autotest_common.sh@643 -- # waitforlisten 999167 /var/tmp/spdk2.sock 00:07:29.785 08:25:22 -- common/autotest_common.sh@819 -- # '[' -z 999167 
']' 00:07:29.785 08:25:22 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:29.785 08:25:22 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:29.785 08:25:22 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:29.785 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:29.785 08:25:22 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:29.785 08:25:22 -- common/autotest_common.sh@10 -- # set +x 00:07:29.785 [2024-10-04 08:25:22.180491] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:07:29.785 [2024-10-04 08:25:22.180579] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid999167 ] 00:07:29.785 EAL: No free 2048 kB hugepages reported on node 1 00:07:29.786 [2024-10-04 08:25:22.274780] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 998898 has claimed it. 00:07:29.786 [2024-10-04 08:25:22.274821] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:07:30.353 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (999167) - No such process 00:07:30.353 ERROR: process (pid: 999167) is no longer running 00:07:30.353 08:25:22 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:30.353 08:25:22 -- common/autotest_common.sh@852 -- # return 1 00:07:30.353 08:25:22 -- common/autotest_common.sh@643 -- # es=1 00:07:30.353 08:25:22 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:07:30.353 08:25:22 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:07:30.353 08:25:22 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:07:30.353 08:25:22 -- event/cpu_locks.sh@122 -- # locks_exist 998898 00:07:30.353 08:25:22 -- event/cpu_locks.sh@22 -- # lslocks -p 998898 00:07:30.353 08:25:22 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:30.612 lslocks: write error 00:07:30.612 08:25:23 -- event/cpu_locks.sh@124 -- # killprocess 998898 00:07:30.612 08:25:23 -- common/autotest_common.sh@926 -- # '[' -z 998898 ']' 00:07:30.612 08:25:23 -- common/autotest_common.sh@930 -- # kill -0 998898 00:07:30.612 08:25:23 -- common/autotest_common.sh@931 -- # uname 00:07:30.612 08:25:23 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:30.612 08:25:23 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 998898 00:07:30.612 08:25:23 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:30.612 08:25:23 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:30.612 08:25:23 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 998898' 00:07:30.612 killing process with pid 998898 00:07:30.612 08:25:23 -- common/autotest_common.sh@945 -- # kill 998898 00:07:30.612 08:25:23 -- common/autotest_common.sh@950 -- # wait 998898 00:07:30.870 00:07:30.871 real 0m2.117s 00:07:30.871 user 0m2.336s 00:07:30.871 sys 0m0.602s 00:07:30.871 08:25:23 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:30.871 08:25:23 -- common/autotest_common.sh@10 -- # set +x 00:07:30.871 ************************************ 00:07:30.871 END TEST locking_app_on_locked_coremask 00:07:30.871 ************************************ 00:07:30.871 08:25:23 -- 
event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:07:30.871 08:25:23 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:30.871 08:25:23 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:30.871 08:25:23 -- common/autotest_common.sh@10 -- # set +x 00:07:30.871 ************************************ 00:07:30.871 START TEST locking_overlapped_coremask 00:07:30.871 ************************************ 00:07:30.871 08:25:23 -- common/autotest_common.sh@1104 -- # locking_overlapped_coremask 00:07:30.871 08:25:23 -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=999436 00:07:30.871 08:25:23 -- event/cpu_locks.sh@133 -- # waitforlisten 999436 /var/tmp/spdk.sock 00:07:30.871 08:25:23 -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:07:30.871 08:25:23 -- common/autotest_common.sh@819 -- # '[' -z 999436 ']' 00:07:30.871 08:25:23 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:30.871 08:25:23 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:30.871 08:25:23 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:30.871 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:30.871 08:25:23 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:30.871 08:25:23 -- common/autotest_common.sh@10 -- # set +x 00:07:30.871 [2024-10-04 08:25:23.487360] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:07:30.871 [2024-10-04 08:25:23.487450] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid999436 ] 00:07:30.871 EAL: No free 2048 kB hugepages reported on node 1 00:07:31.129 [2024-10-04 08:25:23.555241] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:31.129 [2024-10-04 08:25:23.593914] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:31.129 [2024-10-04 08:25:23.594059] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:31.129 [2024-10-04 08:25:23.594176] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:31.129 [2024-10-04 08:25:23.594177] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:31.696 08:25:24 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:31.696 08:25:24 -- common/autotest_common.sh@852 -- # return 0 00:07:31.696 08:25:24 -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=999482 00:07:31.696 08:25:24 -- event/cpu_locks.sh@137 -- # NOT waitforlisten 999482 /var/tmp/spdk2.sock 00:07:31.697 08:25:24 -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:07:31.697 08:25:24 -- common/autotest_common.sh@640 -- # local es=0 00:07:31.697 08:25:24 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 999482 /var/tmp/spdk2.sock 00:07:31.697 08:25:24 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:07:31.697 08:25:24 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:31.697 08:25:24 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:07:31.697 08:25:24 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:31.697 08:25:24 -- 
common/autotest_common.sh@643 -- # waitforlisten 999482 /var/tmp/spdk2.sock 00:07:31.697 08:25:24 -- common/autotest_common.sh@819 -- # '[' -z 999482 ']' 00:07:31.697 08:25:24 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:31.697 08:25:24 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:31.697 08:25:24 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:31.697 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:31.697 08:25:24 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:31.697 08:25:24 -- common/autotest_common.sh@10 -- # set +x 00:07:31.697 [2024-10-04 08:25:24.347797] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:07:31.697 [2024-10-04 08:25:24.347884] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid999482 ] 00:07:31.956 EAL: No free 2048 kB hugepages reported on node 1 00:07:31.956 [2024-10-04 08:25:24.442601] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 999436 has claimed it. 00:07:31.956 [2024-10-04 08:25:24.442644] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:07:32.524 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (999482) - No such process 00:07:32.524 ERROR: process (pid: 999482) is no longer running 00:07:32.524 08:25:25 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:32.524 08:25:25 -- common/autotest_common.sh@852 -- # return 1 00:07:32.524 08:25:25 -- common/autotest_common.sh@643 -- # es=1 00:07:32.524 08:25:25 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:07:32.524 08:25:25 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:07:32.524 08:25:25 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:07:32.524 08:25:25 -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:07:32.524 08:25:25 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:07:32.524 08:25:25 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:07:32.524 08:25:25 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:07:32.524 08:25:25 -- event/cpu_locks.sh@141 -- # killprocess 999436 00:07:32.524 08:25:25 -- common/autotest_common.sh@926 -- # '[' -z 999436 ']' 00:07:32.524 08:25:25 -- common/autotest_common.sh@930 -- # kill -0 999436 00:07:32.524 08:25:25 -- common/autotest_common.sh@931 -- # uname 00:07:32.524 08:25:25 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:32.524 08:25:25 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 999436 00:07:32.524 08:25:25 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:32.524 08:25:25 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:32.524 08:25:25 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 999436' 00:07:32.524 killing process with pid 999436 00:07:32.524 08:25:25 -- common/autotest_common.sh@945 -- # kill 999436 00:07:32.524 08:25:25 -- 
common/autotest_common.sh@950 -- # wait 999436 00:07:32.784 00:07:32.784 real 0m1.906s 00:07:32.784 user 0m5.504s 00:07:32.784 sys 0m0.453s 00:07:32.784 08:25:25 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:32.784 08:25:25 -- common/autotest_common.sh@10 -- # set +x 00:07:32.784 ************************************ 00:07:32.784 END TEST locking_overlapped_coremask 00:07:32.784 ************************************ 00:07:32.784 08:25:25 -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:07:32.784 08:25:25 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:32.784 08:25:25 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:32.784 08:25:25 -- common/autotest_common.sh@10 -- # set +x 00:07:32.784 ************************************ 00:07:32.784 START TEST locking_overlapped_coremask_via_rpc 00:07:32.784 ************************************ 00:07:32.784 08:25:25 -- common/autotest_common.sh@1104 -- # locking_overlapped_coremask_via_rpc 00:07:32.784 08:25:25 -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=999772 00:07:32.784 08:25:25 -- event/cpu_locks.sh@149 -- # waitforlisten 999772 /var/tmp/spdk.sock 00:07:32.784 08:25:25 -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:07:32.784 08:25:25 -- common/autotest_common.sh@819 -- # '[' -z 999772 ']' 00:07:32.784 08:25:25 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:32.784 08:25:25 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:32.784 08:25:25 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:32.784 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:32.784 08:25:25 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:32.784 08:25:25 -- common/autotest_common.sh@10 -- # set +x 00:07:32.784 [2024-10-04 08:25:25.438077] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:07:32.784 [2024-10-04 08:25:25.438165] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid999772 ] 00:07:33.043 EAL: No free 2048 kB hugepages reported on node 1 00:07:33.043 [2024-10-04 08:25:25.507094] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:07:33.043 [2024-10-04 08:25:25.507119] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:33.043 [2024-10-04 08:25:25.545860] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:33.043 [2024-10-04 08:25:25.546002] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:33.043 [2024-10-04 08:25:25.546100] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:33.043 [2024-10-04 08:25:25.546102] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:33.611 08:25:26 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:33.611 08:25:26 -- common/autotest_common.sh@852 -- # return 0 00:07:33.612 08:25:26 -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=999882 00:07:33.612 08:25:26 -- event/cpu_locks.sh@153 -- # waitforlisten 999882 /var/tmp/spdk2.sock 00:07:33.612 08:25:26 -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:07:33.612 08:25:26 -- common/autotest_common.sh@819 -- # '[' -z 999882 ']' 00:07:33.612 08:25:26 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:33.612 08:25:26 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:33.612 08:25:26 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:33.612 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:33.612 08:25:26 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:33.612 08:25:26 -- common/autotest_common.sh@10 -- # set +x 00:07:33.870 [2024-10-04 08:25:26.309524] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:07:33.870 [2024-10-04 08:25:26.309612] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid999882 ] 00:07:33.870 EAL: No free 2048 kB hugepages reported on node 1 00:07:33.870 [2024-10-04 08:25:26.401694] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:07:33.870 [2024-10-04 08:25:26.401726] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:33.870 [2024-10-04 08:25:26.480563] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:33.870 [2024-10-04 08:25:26.480709] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:33.870 [2024-10-04 08:25:26.484237] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:33.870 [2024-10-04 08:25:26.484239] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 4 00:07:34.804 08:25:27 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:34.804 08:25:27 -- common/autotest_common.sh@852 -- # return 0 00:07:34.804 08:25:27 -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:07:34.804 08:25:27 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:34.804 08:25:27 -- common/autotest_common.sh@10 -- # set +x 00:07:34.804 08:25:27 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:34.804 08:25:27 -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:34.804 08:25:27 -- common/autotest_common.sh@640 -- # local es=0 00:07:34.804 08:25:27 -- common/autotest_common.sh@642 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:34.804 08:25:27 -- common/autotest_common.sh@628 -- # local arg=rpc_cmd 00:07:34.804 08:25:27 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:34.804 08:25:27 -- common/autotest_common.sh@632 -- # type -t rpc_cmd 00:07:34.804 08:25:27 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:34.804 08:25:27 -- common/autotest_common.sh@643 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:34.804 08:25:27 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:34.804 08:25:27 -- common/autotest_common.sh@10 -- # set +x 00:07:34.804 [2024-10-04 08:25:27.175258] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 999772 has claimed it. 00:07:34.804 request: 00:07:34.804 { 00:07:34.804 "method": "framework_enable_cpumask_locks", 00:07:34.804 "req_id": 1 00:07:34.804 } 00:07:34.804 Got JSON-RPC error response 00:07:34.804 response: 00:07:34.804 { 00:07:34.804 "code": -32603, 00:07:34.804 "message": "Failed to claim CPU core: 2" 00:07:34.804 } 00:07:34.804 08:25:27 -- common/autotest_common.sh@579 -- # [[ 1 == 0 ]] 00:07:34.804 08:25:27 -- common/autotest_common.sh@643 -- # es=1 00:07:34.804 08:25:27 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:07:34.804 08:25:27 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:07:34.804 08:25:27 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:07:34.804 08:25:27 -- event/cpu_locks.sh@158 -- # waitforlisten 999772 /var/tmp/spdk.sock 00:07:34.804 08:25:27 -- common/autotest_common.sh@819 -- # '[' -z 999772 ']' 00:07:34.804 08:25:27 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:34.804 08:25:27 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:34.804 08:25:27 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:34.804 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:07:34.804 08:25:27 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:34.804 08:25:27 -- common/autotest_common.sh@10 -- # set +x 00:07:34.804 08:25:27 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:34.804 08:25:27 -- common/autotest_common.sh@852 -- # return 0 00:07:34.804 08:25:27 -- event/cpu_locks.sh@159 -- # waitforlisten 999882 /var/tmp/spdk2.sock 00:07:34.804 08:25:27 -- common/autotest_common.sh@819 -- # '[' -z 999882 ']' 00:07:34.804 08:25:27 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:34.804 08:25:27 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:34.804 08:25:27 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:34.804 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:34.804 08:25:27 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:34.804 08:25:27 -- common/autotest_common.sh@10 -- # set +x 00:07:35.062 08:25:27 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:35.062 08:25:27 -- common/autotest_common.sh@852 -- # return 0 00:07:35.062 08:25:27 -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:07:35.062 08:25:27 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:07:35.062 08:25:27 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:07:35.062 08:25:27 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:07:35.062 00:07:35.062 real 0m2.148s 00:07:35.062 user 0m0.892s 00:07:35.062 sys 0m0.182s 00:07:35.062 08:25:27 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:35.062 08:25:27 -- common/autotest_common.sh@10 -- # set +x 00:07:35.062 ************************************ 00:07:35.062 END TEST locking_overlapped_coremask_via_rpc 00:07:35.062 ************************************ 00:07:35.062 08:25:27 -- event/cpu_locks.sh@174 -- # cleanup 00:07:35.062 08:25:27 -- event/cpu_locks.sh@15 -- # [[ -z 999772 ]] 00:07:35.062 08:25:27 -- event/cpu_locks.sh@15 -- # killprocess 999772 00:07:35.062 08:25:27 -- common/autotest_common.sh@926 -- # '[' -z 999772 ']' 00:07:35.062 08:25:27 -- common/autotest_common.sh@930 -- # kill -0 999772 00:07:35.062 08:25:27 -- common/autotest_common.sh@931 -- # uname 00:07:35.062 08:25:27 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:35.062 08:25:27 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 999772 00:07:35.062 08:25:27 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:35.062 08:25:27 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:35.062 08:25:27 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 999772' 00:07:35.062 killing process with pid 999772 00:07:35.062 08:25:27 -- common/autotest_common.sh@945 -- # kill 999772 00:07:35.063 08:25:27 -- common/autotest_common.sh@950 -- # wait 999772 00:07:35.321 08:25:27 -- event/cpu_locks.sh@16 -- # [[ -z 999882 ]] 00:07:35.321 08:25:27 -- event/cpu_locks.sh@16 -- # killprocess 999882 00:07:35.321 08:25:27 -- common/autotest_common.sh@926 -- # '[' -z 999882 ']' 00:07:35.321 08:25:27 -- common/autotest_common.sh@930 -- # kill -0 999882 00:07:35.321 08:25:27 -- common/autotest_common.sh@931 -- # uname 00:07:35.321 
08:25:27 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:35.321 08:25:27 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 999882 00:07:35.580 08:25:28 -- common/autotest_common.sh@932 -- # process_name=reactor_2 00:07:35.580 08:25:28 -- common/autotest_common.sh@936 -- # '[' reactor_2 = sudo ']' 00:07:35.580 08:25:28 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 999882' 00:07:35.580 killing process with pid 999882 00:07:35.580 08:25:28 -- common/autotest_common.sh@945 -- # kill 999882 00:07:35.580 08:25:28 -- common/autotest_common.sh@950 -- # wait 999882 00:07:35.840 08:25:28 -- event/cpu_locks.sh@18 -- # rm -f 00:07:35.840 08:25:28 -- event/cpu_locks.sh@1 -- # cleanup 00:07:35.840 08:25:28 -- event/cpu_locks.sh@15 -- # [[ -z 999772 ]] 00:07:35.840 08:25:28 -- event/cpu_locks.sh@15 -- # killprocess 999772 00:07:35.840 08:25:28 -- common/autotest_common.sh@926 -- # '[' -z 999772 ']' 00:07:35.840 08:25:28 -- common/autotest_common.sh@930 -- # kill -0 999772 00:07:35.840 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 930: kill: (999772) - No such process 00:07:35.840 08:25:28 -- common/autotest_common.sh@953 -- # echo 'Process with pid 999772 is not found' 00:07:35.840 Process with pid 999772 is not found 00:07:35.840 08:25:28 -- event/cpu_locks.sh@16 -- # [[ -z 999882 ]] 00:07:35.840 08:25:28 -- event/cpu_locks.sh@16 -- # killprocess 999882 00:07:35.840 08:25:28 -- common/autotest_common.sh@926 -- # '[' -z 999882 ']' 00:07:35.840 08:25:28 -- common/autotest_common.sh@930 -- # kill -0 999882 00:07:35.840 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 930: kill: (999882) - No such process 00:07:35.840 08:25:28 -- common/autotest_common.sh@953 -- # echo 'Process with pid 999882 is not found' 00:07:35.840 Process with pid 999882 is not found 00:07:35.840 08:25:28 -- event/cpu_locks.sh@18 -- # rm -f 00:07:35.840 00:07:35.840 real 0m17.534s 00:07:35.840 user 0m30.595s 00:07:35.840 sys 0m5.540s 00:07:35.840 08:25:28 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:35.840 08:25:28 -- common/autotest_common.sh@10 -- # set +x 00:07:35.840 ************************************ 00:07:35.840 END TEST cpu_locks 00:07:35.840 ************************************ 00:07:35.840 00:07:35.840 real 0m42.382s 00:07:35.840 user 1m20.324s 00:07:35.840 sys 0m9.501s 00:07:35.840 08:25:28 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:35.840 08:25:28 -- common/autotest_common.sh@10 -- # set +x 00:07:35.840 ************************************ 00:07:35.840 END TEST event 00:07:35.840 ************************************ 00:07:35.840 08:25:28 -- spdk/autotest.sh@188 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:07:35.840 08:25:28 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:35.840 08:25:28 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:35.840 08:25:28 -- common/autotest_common.sh@10 -- # set +x 00:07:35.840 ************************************ 00:07:35.840 START TEST thread 00:07:35.840 ************************************ 00:07:35.840 08:25:28 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:07:35.840 * Looking for test storage... 
00:07:36.099 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:07:36.099 08:25:28 -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:36.099 08:25:28 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']' 00:07:36.099 08:25:28 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:36.099 08:25:28 -- common/autotest_common.sh@10 -- # set +x 00:07:36.099 ************************************ 00:07:36.099 START TEST thread_poller_perf 00:07:36.099 ************************************ 00:07:36.099 08:25:28 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:36.099 [2024-10-04 08:25:28.556018] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:07:36.099 [2024-10-04 08:25:28.556155] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1000412 ] 00:07:36.099 EAL: No free 2048 kB hugepages reported on node 1 00:07:36.099 [2024-10-04 08:25:28.626951] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:36.099 [2024-10-04 08:25:28.663607] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:36.099 Running 1000 pollers for 1 seconds with 1 microseconds period. 00:07:37.478 ====================================== 00:07:37.478 busy:2505512876 (cyc) 00:07:37.478 total_run_count: 796000 00:07:37.478 tsc_hz: 2500000000 (cyc) 00:07:37.478 ====================================== 00:07:37.478 poller_cost: 3147 (cyc), 1258 (nsec) 00:07:37.478 00:07:37.478 real 0m1.183s 00:07:37.478 user 0m1.090s 00:07:37.478 sys 0m0.088s 00:07:37.478 08:25:29 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:37.478 08:25:29 -- common/autotest_common.sh@10 -- # set +x 00:07:37.478 ************************************ 00:07:37.478 END TEST thread_poller_perf 00:07:37.478 ************************************ 00:07:37.478 08:25:29 -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:37.478 08:25:29 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']' 00:07:37.478 08:25:29 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:37.478 08:25:29 -- common/autotest_common.sh@10 -- # set +x 00:07:37.478 ************************************ 00:07:37.478 START TEST thread_poller_perf 00:07:37.478 ************************************ 00:07:37.478 08:25:29 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:37.478 [2024-10-04 08:25:29.786064] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
00:07:37.478 [2024-10-04 08:25:29.786161] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1000694 ] 00:07:37.478 EAL: No free 2048 kB hugepages reported on node 1 00:07:37.479 [2024-10-04 08:25:29.855754] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:37.479 [2024-10-04 08:25:29.889928] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:37.479 Running 1000 pollers for 1 seconds with 0 microseconds period. 00:07:38.416 ====================================== 00:07:38.416 busy:2501828968 (cyc) 00:07:38.416 total_run_count: 12805000 00:07:38.416 tsc_hz: 2500000000 (cyc) 00:07:38.416 ====================================== 00:07:38.416 poller_cost: 195 (cyc), 78 (nsec) 00:07:38.416 00:07:38.416 real 0m1.179s 00:07:38.416 user 0m1.086s 00:07:38.416 sys 0m0.089s 00:07:38.416 08:25:30 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:38.416 08:25:30 -- common/autotest_common.sh@10 -- # set +x 00:07:38.416 ************************************ 00:07:38.416 END TEST thread_poller_perf 00:07:38.416 ************************************ 00:07:38.416 08:25:30 -- thread/thread.sh@17 -- # [[ n != \y ]] 00:07:38.416 08:25:30 -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:07:38.416 08:25:30 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:38.416 08:25:30 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:38.416 08:25:30 -- common/autotest_common.sh@10 -- # set +x 00:07:38.416 ************************************ 00:07:38.416 START TEST thread_spdk_lock 00:07:38.416 ************************************ 00:07:38.416 08:25:30 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:07:38.416 [2024-10-04 08:25:31.015028] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
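In both result blocks above, poller_cost is simply busy TSC cycles divided by total_run_count, converted to nanoseconds via tsc_hz; the reported values reproduce with shell arithmetic:

    echo $(( 2505512876 / 796000 ))              # 1 us period run -> 3147 cyc
    echo $(( 3147 * 1000000000 / 2500000000 ))   #                 -> 1258 nsec
    echo $(( 2501828968 / 12805000 ))            # 0 us period run -> 195 cyc
    echo $(( 195 * 1000000000 / 2500000000 ))    #                 -> 78 nsec

The roughly 16x cheaper poll in the second run lines up with -l 0 registering plain pollers, which skip the timer bookkeeping that a 1 us period implies.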
00:07:38.416 [2024-10-04 08:25:31.015120] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1000855 ] 00:07:38.416 EAL: No free 2048 kB hugepages reported on node 1 00:07:38.416 [2024-10-04 08:25:31.085035] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:38.675 [2024-10-04 08:25:31.120939] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:38.675 [2024-10-04 08:25:31.120943] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:38.952 [2024-10-04 08:25:31.603608] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 955:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:38.952 [2024-10-04 08:25:31.603647] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3062:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:07:38.952 [2024-10-04 08:25:31.603657] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3017:sspin_stacks_print: *ERROR*: spinlock 0x12e2e40 00:07:38.952 [2024-10-04 08:25:31.604457] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 850:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:38.952 [2024-10-04 08:25:31.604561] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1016:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:38.952 [2024-10-04 08:25:31.604578] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 850:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:39.211 Starting test contend 00:07:39.211 Worker Delay Wait us Hold us Total us 00:07:39.211 0 3 171104 180025 351130 00:07:39.211 1 5 85359 282192 367551 00:07:39.211 PASS test contend 00:07:39.211 Starting test hold_by_poller 00:07:39.211 PASS test hold_by_poller 00:07:39.211 Starting test hold_by_message 00:07:39.211 PASS test hold_by_message 00:07:39.211 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:07:39.211 100014 assertions passed 00:07:39.211 0 assertions failed 00:07:39.211 00:07:39.211 real 0m0.659s 00:07:39.211 user 0m1.053s 00:07:39.211 sys 0m0.085s 00:07:39.211 08:25:31 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:39.211 08:25:31 -- common/autotest_common.sh@10 -- # set +x 00:07:39.211 ************************************ 00:07:39.211 END TEST thread_spdk_lock 00:07:39.211 ************************************ 00:07:39.211 00:07:39.211 real 0m3.262s 00:07:39.211 user 0m3.309s 00:07:39.211 sys 0m0.454s 00:07:39.211 08:25:31 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:39.211 08:25:31 -- common/autotest_common.sh@10 -- # set +x 00:07:39.211 ************************************ 00:07:39.211 END TEST thread 00:07:39.211 ************************************ 00:07:39.211 08:25:31 -- spdk/autotest.sh@189 -- # run_test accel /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:07:39.211 08:25:31 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 
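Every *ERROR* line in the spdk_lock run above is provoked on purpose: the test drives each unrecoverable-spinlock path (a lock still held while the SPDK thread goes off CPU, re-acquiring a lock the thread already owns) and counts the detections, hence "100014 assertions passed / 0 assertions failed". In the contend table the columns are per-worker microseconds, and Wait + Hold accounts for the Total, e.g. for worker 1 (delay 5):

    echo $(( 85359 + 282192 ))   # -> 367551 us, as reported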
00:07:39.211 08:25:31 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:39.211 08:25:31 -- common/autotest_common.sh@10 -- # set +x 00:07:39.211 ************************************ 00:07:39.211 START TEST accel 00:07:39.211 ************************************ 00:07:39.211 08:25:31 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:07:39.211 * Looking for test storage... 00:07:39.212 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:07:39.212 08:25:31 -- accel/accel.sh@73 -- # declare -A expected_opcs 00:07:39.212 08:25:31 -- accel/accel.sh@74 -- # get_expected_opcs 00:07:39.212 08:25:31 -- accel/accel.sh@57 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:39.212 08:25:31 -- accel/accel.sh@59 -- # spdk_tgt_pid=1001052 00:07:39.212 08:25:31 -- accel/accel.sh@60 -- # waitforlisten 1001052 00:07:39.212 08:25:31 -- common/autotest_common.sh@819 -- # '[' -z 1001052 ']' 00:07:39.212 08:25:31 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:39.212 08:25:31 -- accel/accel.sh@58 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:07:39.212 08:25:31 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:39.212 08:25:31 -- accel/accel.sh@58 -- # build_accel_config 00:07:39.212 08:25:31 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:39.212 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:39.212 08:25:31 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:39.212 08:25:31 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:39.212 08:25:31 -- common/autotest_common.sh@10 -- # set +x 00:07:39.212 08:25:31 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:39.212 08:25:31 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:39.212 08:25:31 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:39.212 08:25:31 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:39.212 08:25:31 -- accel/accel.sh@41 -- # local IFS=, 00:07:39.212 08:25:31 -- accel/accel.sh@42 -- # jq -r . 00:07:39.212 [2024-10-04 08:25:31.857348] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:07:39.212 [2024-10-04 08:25:31.857420] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1001052 ] 00:07:39.212 EAL: No free 2048 kB hugepages reported on node 1 00:07:39.471 [2024-10-04 08:25:31.923671] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:39.471 [2024-10-04 08:25:31.960084] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:39.471 [2024-10-04 08:25:31.960202] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:40.039 08:25:32 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:40.039 08:25:32 -- common/autotest_common.sh@852 -- # return 0 00:07:40.039 08:25:32 -- accel/accel.sh@62 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:07:40.039 08:25:32 -- accel/accel.sh@62 -- # rpc_cmd accel_get_opc_assignments 00:07:40.039 08:25:32 -- accel/accel.sh@62 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:07:40.039 08:25:32 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:40.039 08:25:32 -- common/autotest_common.sh@10 -- # set +x 00:07:40.039 08:25:32 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:40.299 08:25:32 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:07:40.299 08:25:32 -- accel/accel.sh@64 -- # IFS== 00:07:40.299 08:25:32 -- accel/accel.sh@64 -- # read -r opc module 00:07:40.299 08:25:32 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:07:40.299 08:25:32 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:07:40.299 08:25:32 -- accel/accel.sh@64 -- # IFS== 00:07:40.299 08:25:32 -- accel/accel.sh@64 -- # read -r opc module 00:07:40.299 08:25:32 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:07:40.299 08:25:32 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:07:40.299 08:25:32 -- accel/accel.sh@64 -- # IFS== 00:07:40.299 08:25:32 -- accel/accel.sh@64 -- # read -r opc module 00:07:40.299 08:25:32 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:07:40.299 08:25:32 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:07:40.299 08:25:32 -- accel/accel.sh@64 -- # IFS== 00:07:40.299 08:25:32 -- accel/accel.sh@64 -- # read -r opc module 00:07:40.299 08:25:32 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:07:40.299 08:25:32 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:07:40.299 08:25:32 -- accel/accel.sh@64 -- # IFS== 00:07:40.299 08:25:32 -- accel/accel.sh@64 -- # read -r opc module 00:07:40.299 08:25:32 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:07:40.299 08:25:32 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:07:40.299 08:25:32 -- accel/accel.sh@64 -- # IFS== 00:07:40.299 08:25:32 -- accel/accel.sh@64 -- # read -r opc module 00:07:40.299 08:25:32 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:07:40.299 08:25:32 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:07:40.299 08:25:32 -- accel/accel.sh@64 -- # IFS== 00:07:40.299 08:25:32 -- accel/accel.sh@64 -- # read -r opc module 00:07:40.299 08:25:32 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:07:40.299 08:25:32 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:07:40.299 08:25:32 -- accel/accel.sh@64 -- # IFS== 00:07:40.299 08:25:32 -- accel/accel.sh@64 -- # read -r opc module 00:07:40.299 08:25:32 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:07:40.299 08:25:32 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:07:40.299 08:25:32 -- accel/accel.sh@64 -- # IFS== 00:07:40.299 08:25:32 -- accel/accel.sh@64 -- # read -r opc module 00:07:40.299 08:25:32 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:07:40.299 08:25:32 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:07:40.299 08:25:32 -- accel/accel.sh@64 -- # IFS== 00:07:40.299 08:25:32 -- accel/accel.sh@64 -- # read -r opc module 00:07:40.299 08:25:32 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:07:40.299 08:25:32 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:07:40.299 08:25:32 -- accel/accel.sh@64 -- # IFS== 00:07:40.299 08:25:32 -- accel/accel.sh@64 -- # read -r opc module 00:07:40.299 08:25:32 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:07:40.299 08:25:32 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:07:40.299 08:25:32 -- accel/accel.sh@64 -- # IFS== 00:07:40.299 08:25:32 -- accel/accel.sh@64 -- # read -r opc module 00:07:40.299 
08:25:32 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:07:40.299 08:25:32 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:07:40.299 08:25:32 -- accel/accel.sh@64 -- # IFS== 00:07:40.299 08:25:32 -- accel/accel.sh@64 -- # read -r opc module 00:07:40.299 08:25:32 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:07:40.299 08:25:32 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:07:40.299 08:25:32 -- accel/accel.sh@64 -- # IFS== 00:07:40.299 08:25:32 -- accel/accel.sh@64 -- # read -r opc module 00:07:40.299 08:25:32 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:07:40.299 08:25:32 -- accel/accel.sh@67 -- # killprocess 1001052 00:07:40.299 08:25:32 -- common/autotest_common.sh@926 -- # '[' -z 1001052 ']' 00:07:40.299 08:25:32 -- common/autotest_common.sh@930 -- # kill -0 1001052 00:07:40.299 08:25:32 -- common/autotest_common.sh@931 -- # uname 00:07:40.299 08:25:32 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:40.299 08:25:32 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1001052 00:07:40.299 08:25:32 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:40.299 08:25:32 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:40.299 08:25:32 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1001052' 00:07:40.299 killing process with pid 1001052 00:07:40.299 08:25:32 -- common/autotest_common.sh@945 -- # kill 1001052 00:07:40.299 08:25:32 -- common/autotest_common.sh@950 -- # wait 1001052 00:07:40.559 08:25:33 -- accel/accel.sh@68 -- # trap - ERR 00:07:40.559 08:25:33 -- accel/accel.sh@81 -- # run_test accel_help accel_perf -h 00:07:40.559 08:25:33 -- common/autotest_common.sh@1077 -- # '[' 3 -le 1 ']' 00:07:40.559 08:25:33 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:40.559 08:25:33 -- common/autotest_common.sh@10 -- # set +x 00:07:40.559 08:25:33 -- common/autotest_common.sh@1104 -- # accel_perf -h 00:07:40.559 08:25:33 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:07:40.559 08:25:33 -- accel/accel.sh@12 -- # build_accel_config 00:07:40.559 08:25:33 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:40.559 08:25:33 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:40.559 08:25:33 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:40.559 08:25:33 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:40.559 08:25:33 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:40.559 08:25:33 -- accel/accel.sh@41 -- # local IFS=, 00:07:40.559 08:25:33 -- accel/accel.sh@42 -- # jq -r . 
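The long IFS== loop above walks the opcode-to-module map one pair at a time; with the target still listening, the same data can be fetched in one shot (default socket, jq filter exactly as the script uses it):

    ./scripts/rpc.py -s /var/tmp/spdk.sock accel_get_opc_assignments \
        | jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]'
    # copy=software
    # fill=software
    # ... one line per opcode, all "software" in this run since no accel JSON config was supplied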
00:07:40.559 08:25:33 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:40.559 08:25:33 -- common/autotest_common.sh@10 -- # set +x 00:07:40.559 08:25:33 -- accel/accel.sh@83 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:07:40.559 08:25:33 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:07:40.559 08:25:33 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:40.559 08:25:33 -- common/autotest_common.sh@10 -- # set +x 00:07:40.559 ************************************ 00:07:40.559 START TEST accel_missing_filename 00:07:40.559 ************************************ 00:07:40.559 08:25:33 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w compress 00:07:40.559 08:25:33 -- common/autotest_common.sh@640 -- # local es=0 00:07:40.559 08:25:33 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w compress 00:07:40.559 08:25:33 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:07:40.559 08:25:33 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:40.559 08:25:33 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:07:40.559 08:25:33 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:40.559 08:25:33 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w compress 00:07:40.559 08:25:33 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:07:40.559 08:25:33 -- accel/accel.sh@12 -- # build_accel_config 00:07:40.559 08:25:33 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:40.559 08:25:33 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:40.559 08:25:33 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:40.559 08:25:33 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:40.559 08:25:33 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:40.559 08:25:33 -- accel/accel.sh@41 -- # local IFS=, 00:07:40.559 08:25:33 -- accel/accel.sh@42 -- # jq -r . 00:07:40.559 [2024-10-04 08:25:33.169526] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:07:40.559 [2024-10-04 08:25:33.169651] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1001358 ] 00:07:40.559 EAL: No free 2048 kB hugepages reported on node 1 00:07:40.559 [2024-10-04 08:25:33.239851] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:40.820 [2024-10-04 08:25:33.275338] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:40.820 [2024-10-04 08:25:33.314617] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:40.820 [2024-10-04 08:25:33.374423] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:07:40.820 A filename is required. 
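"A filename is required." is the expected failure here: for compress the -o transfer size defaults to the input file size, so accel_perf refuses to start without -l. The valid counterpart, as the next test invokes it (same bib input file), is:

    ./build/examples/accel_perf -t 1 -w compress \
        -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib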
00:07:40.820 08:25:33 -- common/autotest_common.sh@643 -- # es=234 00:07:40.820 08:25:33 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:07:40.820 08:25:33 -- common/autotest_common.sh@652 -- # es=106 00:07:40.820 08:25:33 -- common/autotest_common.sh@653 -- # case "$es" in 00:07:40.820 08:25:33 -- common/autotest_common.sh@660 -- # es=1 00:07:40.820 08:25:33 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:07:40.820 00:07:40.820 real 0m0.284s 00:07:40.820 user 0m0.187s 00:07:40.820 sys 0m0.133s 00:07:40.820 08:25:33 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:40.820 08:25:33 -- common/autotest_common.sh@10 -- # set +x 00:07:40.820 ************************************ 00:07:40.820 END TEST accel_missing_filename 00:07:40.820 ************************************ 00:07:40.820 08:25:33 -- accel/accel.sh@85 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:40.820 08:25:33 -- common/autotest_common.sh@1077 -- # '[' 10 -le 1 ']' 00:07:40.820 08:25:33 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:40.820 08:25:33 -- common/autotest_common.sh@10 -- # set +x 00:07:40.820 ************************************ 00:07:40.820 START TEST accel_compress_verify 00:07:40.820 ************************************ 00:07:40.820 08:25:33 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:40.820 08:25:33 -- common/autotest_common.sh@640 -- # local es=0 00:07:40.820 08:25:33 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:40.820 08:25:33 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:07:40.820 08:25:33 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:40.820 08:25:33 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:07:40.820 08:25:33 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:40.820 08:25:33 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:40.820 08:25:33 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:40.820 08:25:33 -- accel/accel.sh@12 -- # build_accel_config 00:07:40.820 08:25:33 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:40.820 08:25:33 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:40.820 08:25:33 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:40.820 08:25:33 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:40.820 08:25:33 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:40.820 08:25:33 -- accel/accel.sh@41 -- # local IFS=, 00:07:40.820 08:25:33 -- accel/accel.sh@42 -- # jq -r . 00:07:40.820 [2024-10-04 08:25:33.490633] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
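The es=234 -> es=106 -> es=1 sequence at the top of this block is the harness's NOT wrapper normalizing the exit status of a command it expects to fail: 234 has the high bit set, so it is folded down (234 & 127 = 106) before being collapsed to a plain failure indicator. A simplified sketch of that helper's shape, not the verbatim autotest_common.sh implementation:

    NOT() {                                    # succeeds only if "$@" fails
        local es=0
        "$@" || es=$?
        (( es > 128 )) && es=$(( es & 127 ))   # strip the signal-style high bit
        (( es != 0 ))                          # exit 0 iff the command failed
    }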
00:07:40.820 [2024-10-04 08:25:33.490740] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1001379 ] 00:07:41.079 EAL: No free 2048 kB hugepages reported on node 1 00:07:41.079 [2024-10-04 08:25:33.561457] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:41.079 [2024-10-04 08:25:33.596704] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:41.079 [2024-10-04 08:25:33.636302] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:41.079 [2024-10-04 08:25:33.696216] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:07:41.079 00:07:41.079 Compression does not support the verify option, aborting. 00:07:41.079 08:25:33 -- common/autotest_common.sh@643 -- # es=161 00:07:41.079 08:25:33 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:07:41.079 08:25:33 -- common/autotest_common.sh@652 -- # es=33 00:07:41.079 08:25:33 -- common/autotest_common.sh@653 -- # case "$es" in 00:07:41.079 08:25:33 -- common/autotest_common.sh@660 -- # es=1 00:07:41.079 08:25:33 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:07:41.079 00:07:41.079 real 0m0.283s 00:07:41.079 user 0m0.184s 00:07:41.079 sys 0m0.136s 00:07:41.079 08:25:33 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:41.079 08:25:33 -- common/autotest_common.sh@10 -- # set +x 00:07:41.079 ************************************ 00:07:41.079 END TEST accel_compress_verify 00:07:41.079 ************************************ 00:07:41.339 08:25:33 -- accel/accel.sh@87 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:07:41.339 08:25:33 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:07:41.339 08:25:33 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:41.339 08:25:33 -- common/autotest_common.sh@10 -- # set +x 00:07:41.339 ************************************ 00:07:41.339 START TEST accel_wrong_workload 00:07:41.339 ************************************ 00:07:41.339 08:25:33 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w foobar 00:07:41.339 08:25:33 -- common/autotest_common.sh@640 -- # local es=0 00:07:41.339 08:25:33 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:07:41.339 08:25:33 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:07:41.339 08:25:33 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:41.339 08:25:33 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:07:41.339 08:25:33 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:41.339 08:25:33 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w foobar 00:07:41.339 08:25:33 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:07:41.339 08:25:33 -- accel/accel.sh@12 -- # build_accel_config 00:07:41.339 08:25:33 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:41.339 08:25:33 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:41.339 08:25:33 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:41.339 08:25:33 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:41.339 08:25:33 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:41.339 08:25:33 -- accel/accel.sh@41 -- # local IFS=, 00:07:41.339 08:25:33 -- accel/accel.sh@42 -- # jq -r . 
00:07:41.339 Unsupported workload type: foobar 00:07:41.339 [2024-10-04 08:25:33.814114] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:07:41.339 accel_perf options: 00:07:41.339 [-h help message] 00:07:41.339 [-q queue depth per core] 00:07:41.339 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:07:41.339 [-T number of threads per core 00:07:41.339 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:07:41.339 [-t time in seconds] 00:07:41.339 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:07:41.339 [ dif_verify, , dif_generate, dif_generate_copy 00:07:41.339 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:07:41.339 [-l for compress/decompress workloads, name of uncompressed input file 00:07:41.339 [-S for crc32c workload, use this seed value (default 0) 00:07:41.339 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:07:41.339 [-f for fill workload, use this BYTE value (default 255) 00:07:41.339 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:07:41.339 [-y verify result if this switch is on] 00:07:41.339 [-a tasks to allocate per core (default: same value as -q)] 00:07:41.339 Can be used to spread operations across a wider range of memory. 00:07:41.339 08:25:33 -- common/autotest_common.sh@643 -- # es=1 00:07:41.339 08:25:33 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:07:41.339 08:25:33 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:07:41.339 08:25:33 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:07:41.339 00:07:41.339 real 0m0.025s 00:07:41.339 user 0m0.006s 00:07:41.339 sys 0m0.019s 00:07:41.339 08:25:33 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:41.339 08:25:33 -- common/autotest_common.sh@10 -- # set +x 00:07:41.339 ************************************ 00:07:41.339 END TEST accel_wrong_workload 00:07:41.339 ************************************ 00:07:41.339 Error: writing output failed: Broken pipe 00:07:41.339 08:25:33 -- accel/accel.sh@89 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:07:41.339 08:25:33 -- common/autotest_common.sh@1077 -- # '[' 10 -le 1 ']' 00:07:41.339 08:25:33 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:41.339 08:25:33 -- common/autotest_common.sh@10 -- # set +x 00:07:41.339 ************************************ 00:07:41.339 START TEST accel_negative_buffers 00:07:41.339 ************************************ 00:07:41.339 08:25:33 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:07:41.339 08:25:33 -- common/autotest_common.sh@640 -- # local es=0 00:07:41.339 08:25:33 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:07:41.339 08:25:33 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:07:41.339 08:25:33 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:41.339 08:25:33 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:07:41.339 08:25:33 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:41.339 08:25:33 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w xor -y -x -1 00:07:41.339 08:25:33 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w 
xor -y -x -1 00:07:41.339 08:25:33 -- accel/accel.sh@12 -- # build_accel_config 00:07:41.339 08:25:33 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:41.339 08:25:33 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:41.339 08:25:33 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:41.339 08:25:33 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:41.339 08:25:33 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:41.339 08:25:33 -- accel/accel.sh@41 -- # local IFS=, 00:07:41.339 08:25:33 -- accel/accel.sh@42 -- # jq -r . 00:07:41.339 -x option must be non-negative. 00:07:41.339 [2024-10-04 08:25:33.879592] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:07:41.339 accel_perf options: 00:07:41.339 [-h help message] 00:07:41.339 [-q queue depth per core] 00:07:41.339 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:07:41.339 [-T number of threads per core 00:07:41.339 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:07:41.339 [-t time in seconds] 00:07:41.339 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:07:41.339 [ dif_verify, , dif_generate, dif_generate_copy 00:07:41.339 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:07:41.339 [-l for compress/decompress workloads, name of uncompressed input file 00:07:41.339 [-S for crc32c workload, use this seed value (default 0) 00:07:41.340 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:07:41.340 [-f for fill workload, use this BYTE value (default 255) 00:07:41.340 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:07:41.340 [-y verify result if this switch is on] 00:07:41.340 [-a tasks to allocate per core (default: same value as -q)] 00:07:41.340 Can be used to spread operations across a wider range of memory. 
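The usage dump above is the rejection path for -x: the xor workload needs at least two source buffers ("minimum: 2" per the option text), so -x -1 fails while arguments are parsed, before any work is queued. A valid invocation for contrast:

    ./build/examples/accel_perf -t 1 -w xor -y -x 2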
00:07:41.340 08:25:33 -- common/autotest_common.sh@643 -- # es=1 00:07:41.340 08:25:33 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:07:41.340 08:25:33 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:07:41.340 08:25:33 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:07:41.340 00:07:41.340 real 0m0.024s 00:07:41.340 user 0m0.008s 00:07:41.340 sys 0m0.016s 00:07:41.340 08:25:33 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:41.340 08:25:33 -- common/autotest_common.sh@10 -- # set +x 00:07:41.340 ************************************ 00:07:41.340 END TEST accel_negative_buffers 00:07:41.340 ************************************ 00:07:41.340 Error: writing output failed: Broken pipe 00:07:41.340 08:25:33 -- accel/accel.sh@93 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:07:41.340 08:25:33 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:07:41.340 08:25:33 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:41.340 08:25:33 -- common/autotest_common.sh@10 -- # set +x 00:07:41.340 ************************************ 00:07:41.340 START TEST accel_crc32c 00:07:41.340 ************************************ 00:07:41.340 08:25:33 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w crc32c -S 32 -y 00:07:41.340 08:25:33 -- accel/accel.sh@16 -- # local accel_opc 00:07:41.340 08:25:33 -- accel/accel.sh@17 -- # local accel_module 00:07:41.340 08:25:33 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:07:41.340 08:25:33 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:07:41.340 08:25:33 -- accel/accel.sh@12 -- # build_accel_config 00:07:41.340 08:25:33 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:41.340 08:25:33 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:41.340 08:25:33 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:41.340 08:25:33 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:41.340 08:25:33 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:41.340 08:25:33 -- accel/accel.sh@41 -- # local IFS=, 00:07:41.340 08:25:33 -- accel/accel.sh@42 -- # jq -r . 00:07:41.340 [2024-10-04 08:25:33.934253] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:07:41.340 [2024-10-04 08:25:33.934328] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1001482 ] 00:07:41.340 EAL: No free 2048 kB hugepages reported on node 1 00:07:41.340 [2024-10-04 08:25:34.001365] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:41.599 [2024-10-04 08:25:34.038039] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:42.536 08:25:35 -- accel/accel.sh@18 -- # out=' 00:07:42.536 SPDK Configuration: 00:07:42.536 Core mask: 0x1 00:07:42.536 00:07:42.536 Accel Perf Configuration: 00:07:42.536 Workload Type: crc32c 00:07:42.536 CRC-32C seed: 32 00:07:42.536 Transfer size: 4096 bytes 00:07:42.536 Vector count 1 00:07:42.536 Module: software 00:07:42.536 Queue depth: 32 00:07:42.536 Allocate depth: 32 00:07:42.536 # threads/core: 1 00:07:42.536 Run time: 1 seconds 00:07:42.536 Verify: Yes 00:07:42.536 00:07:42.536 Running for 1 seconds... 
00:07:42.536 00:07:42.536 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:42.536 ------------------------------------------------------------------------------------ 00:07:42.536 0,0 844320/s 3298 MiB/s 0 0 00:07:42.536 ==================================================================================== 00:07:42.536 Total 844320/s 3298 MiB/s 0 0' 00:07:42.536 08:25:35 -- accel/accel.sh@20 -- # IFS=: 00:07:42.536 08:25:35 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:07:42.536 08:25:35 -- accel/accel.sh@20 -- # read -r var val 00:07:42.536 08:25:35 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:07:42.536 08:25:35 -- accel/accel.sh@12 -- # build_accel_config 00:07:42.536 08:25:35 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:42.536 08:25:35 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:42.536 08:25:35 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:42.536 08:25:35 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:42.536 08:25:35 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:42.536 08:25:35 -- accel/accel.sh@41 -- # local IFS=, 00:07:42.536 08:25:35 -- accel/accel.sh@42 -- # jq -r . 00:07:42.536 [2024-10-04 08:25:35.205699] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:07:42.536 [2024-10-04 08:25:35.205751] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1001711 ] 00:07:42.796 EAL: No free 2048 kB hugepages reported on node 1 00:07:42.796 [2024-10-04 08:25:35.268057] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:42.796 [2024-10-04 08:25:35.302068] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:42.796 08:25:35 -- accel/accel.sh@21 -- # val= 00:07:42.796 08:25:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.796 08:25:35 -- accel/accel.sh@20 -- # IFS=: 00:07:42.796 08:25:35 -- accel/accel.sh@20 -- # read -r var val 00:07:42.796 08:25:35 -- accel/accel.sh@21 -- # val= 00:07:42.796 08:25:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.796 08:25:35 -- accel/accel.sh@20 -- # IFS=: 00:07:42.796 08:25:35 -- accel/accel.sh@20 -- # read -r var val 00:07:42.796 08:25:35 -- accel/accel.sh@21 -- # val=0x1 00:07:42.796 08:25:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.796 08:25:35 -- accel/accel.sh@20 -- # IFS=: 00:07:42.796 08:25:35 -- accel/accel.sh@20 -- # read -r var val 00:07:42.796 08:25:35 -- accel/accel.sh@21 -- # val= 00:07:42.796 08:25:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.796 08:25:35 -- accel/accel.sh@20 -- # IFS=: 00:07:42.796 08:25:35 -- accel/accel.sh@20 -- # read -r var val 00:07:42.796 08:25:35 -- accel/accel.sh@21 -- # val= 00:07:42.796 08:25:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.796 08:25:35 -- accel/accel.sh@20 -- # IFS=: 00:07:42.796 08:25:35 -- accel/accel.sh@20 -- # read -r var val 00:07:42.796 08:25:35 -- accel/accel.sh@21 -- # val=crc32c 00:07:42.796 08:25:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.796 08:25:35 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:07:42.796 08:25:35 -- accel/accel.sh@20 -- # IFS=: 00:07:42.796 08:25:35 -- accel/accel.sh@20 -- # read -r var val 00:07:42.796 08:25:35 -- accel/accel.sh@21 -- # val=32 00:07:42.796 08:25:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.796 08:25:35 -- accel/accel.sh@20 -- # IFS=: 00:07:42.796 
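The bandwidth column in the results above follows directly from the transfer rate: 844320 CRC computations per second over 4096-byte buffers is

    echo $(( 844320 * 4096 / 1024 / 1024 ))   # -> 3298 MiB/s

i.e. the 3298 MiB/s reported for the single-core software crc32c path.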
08:25:35 -- accel/accel.sh@20 -- # read -r var val 00:07:42.796 08:25:35 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:42.796 08:25:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.796 08:25:35 -- accel/accel.sh@20 -- # IFS=: 00:07:42.796 08:25:35 -- accel/accel.sh@20 -- # read -r var val 00:07:42.796 08:25:35 -- accel/accel.sh@21 -- # val= 00:07:42.796 08:25:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.796 08:25:35 -- accel/accel.sh@20 -- # IFS=: 00:07:42.796 08:25:35 -- accel/accel.sh@20 -- # read -r var val 00:07:42.796 08:25:35 -- accel/accel.sh@21 -- # val=software 00:07:42.796 08:25:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.796 08:25:35 -- accel/accel.sh@23 -- # accel_module=software 00:07:42.796 08:25:35 -- accel/accel.sh@20 -- # IFS=: 00:07:42.796 08:25:35 -- accel/accel.sh@20 -- # read -r var val 00:07:42.796 08:25:35 -- accel/accel.sh@21 -- # val=32 00:07:42.796 08:25:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.796 08:25:35 -- accel/accel.sh@20 -- # IFS=: 00:07:42.796 08:25:35 -- accel/accel.sh@20 -- # read -r var val 00:07:42.796 08:25:35 -- accel/accel.sh@21 -- # val=32 00:07:42.796 08:25:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.796 08:25:35 -- accel/accel.sh@20 -- # IFS=: 00:07:42.796 08:25:35 -- accel/accel.sh@20 -- # read -r var val 00:07:42.796 08:25:35 -- accel/accel.sh@21 -- # val=1 00:07:42.796 08:25:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.796 08:25:35 -- accel/accel.sh@20 -- # IFS=: 00:07:42.796 08:25:35 -- accel/accel.sh@20 -- # read -r var val 00:07:42.796 08:25:35 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:42.796 08:25:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.796 08:25:35 -- accel/accel.sh@20 -- # IFS=: 00:07:42.796 08:25:35 -- accel/accel.sh@20 -- # read -r var val 00:07:42.796 08:25:35 -- accel/accel.sh@21 -- # val=Yes 00:07:42.796 08:25:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.796 08:25:35 -- accel/accel.sh@20 -- # IFS=: 00:07:42.796 08:25:35 -- accel/accel.sh@20 -- # read -r var val 00:07:42.796 08:25:35 -- accel/accel.sh@21 -- # val= 00:07:42.796 08:25:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.796 08:25:35 -- accel/accel.sh@20 -- # IFS=: 00:07:42.796 08:25:35 -- accel/accel.sh@20 -- # read -r var val 00:07:42.796 08:25:35 -- accel/accel.sh@21 -- # val= 00:07:42.796 08:25:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.796 08:25:35 -- accel/accel.sh@20 -- # IFS=: 00:07:42.796 08:25:35 -- accel/accel.sh@20 -- # read -r var val 00:07:44.174 08:25:36 -- accel/accel.sh@21 -- # val= 00:07:44.174 08:25:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:44.174 08:25:36 -- accel/accel.sh@20 -- # IFS=: 00:07:44.174 08:25:36 -- accel/accel.sh@20 -- # read -r var val 00:07:44.174 08:25:36 -- accel/accel.sh@21 -- # val= 00:07:44.174 08:25:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:44.174 08:25:36 -- accel/accel.sh@20 -- # IFS=: 00:07:44.174 08:25:36 -- accel/accel.sh@20 -- # read -r var val 00:07:44.174 08:25:36 -- accel/accel.sh@21 -- # val= 00:07:44.174 08:25:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:44.174 08:25:36 -- accel/accel.sh@20 -- # IFS=: 00:07:44.174 08:25:36 -- accel/accel.sh@20 -- # read -r var val 00:07:44.174 08:25:36 -- accel/accel.sh@21 -- # val= 00:07:44.174 08:25:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:44.174 08:25:36 -- accel/accel.sh@20 -- # IFS=: 00:07:44.174 08:25:36 -- accel/accel.sh@20 -- # read -r var val 00:07:44.174 08:25:36 -- accel/accel.sh@21 -- # val= 00:07:44.174 08:25:36 -- accel/accel.sh@22 -- # case "$var" in 
00:07:44.174 08:25:36 -- accel/accel.sh@20 -- # IFS=: 00:07:44.174 08:25:36 -- accel/accel.sh@20 -- # read -r var val 00:07:44.174 08:25:36 -- accel/accel.sh@21 -- # val= 00:07:44.174 08:25:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:44.174 08:25:36 -- accel/accel.sh@20 -- # IFS=: 00:07:44.174 08:25:36 -- accel/accel.sh@20 -- # read -r var val 00:07:44.174 08:25:36 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:44.174 08:25:36 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:07:44.174 08:25:36 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:44.174 00:07:44.174 real 0m2.539s 00:07:44.174 user 0m2.303s 00:07:44.174 sys 0m0.235s 00:07:44.174 08:25:36 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:44.174 08:25:36 -- common/autotest_common.sh@10 -- # set +x 00:07:44.174 ************************************ 00:07:44.174 END TEST accel_crc32c 00:07:44.174 ************************************ 00:07:44.174 08:25:36 -- accel/accel.sh@94 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:07:44.174 08:25:36 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:07:44.174 08:25:36 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:44.174 08:25:36 -- common/autotest_common.sh@10 -- # set +x 00:07:44.174 ************************************ 00:07:44.174 START TEST accel_crc32c_C2 00:07:44.174 ************************************ 00:07:44.174 08:25:36 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w crc32c -y -C 2 00:07:44.174 08:25:36 -- accel/accel.sh@16 -- # local accel_opc 00:07:44.174 08:25:36 -- accel/accel.sh@17 -- # local accel_module 00:07:44.174 08:25:36 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -y -C 2 00:07:44.174 08:25:36 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:07:44.174 08:25:36 -- accel/accel.sh@12 -- # build_accel_config 00:07:44.174 08:25:36 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:44.174 08:25:36 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:44.174 08:25:36 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:44.174 08:25:36 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:44.174 08:25:36 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:44.174 08:25:36 -- accel/accel.sh@41 -- # local IFS=, 00:07:44.174 08:25:36 -- accel/accel.sh@42 -- # jq -r . 00:07:44.174 [2024-10-04 08:25:36.524783] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
00:07:44.174 [2024-10-04 08:25:36.524878] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1001994 ] 00:07:44.174 EAL: No free 2048 kB hugepages reported on node 1 00:07:44.174 [2024-10-04 08:25:36.592079] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:44.174 [2024-10-04 08:25:36.626972] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:45.110 08:25:37 -- accel/accel.sh@18 -- # out=' 00:07:45.110 SPDK Configuration: 00:07:45.110 Core mask: 0x1 00:07:45.110 00:07:45.110 Accel Perf Configuration: 00:07:45.110 Workload Type: crc32c 00:07:45.110 CRC-32C seed: 0 00:07:45.110 Transfer size: 4096 bytes 00:07:45.110 Vector count 2 00:07:45.110 Module: software 00:07:45.110 Queue depth: 32 00:07:45.110 Allocate depth: 32 00:07:45.110 # threads/core: 1 00:07:45.110 Run time: 1 seconds 00:07:45.110 Verify: Yes 00:07:45.110 00:07:45.110 Running for 1 seconds... 00:07:45.110 00:07:45.110 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:45.110 ------------------------------------------------------------------------------------ 00:07:45.110 0,0 611840/s 4780 MiB/s 0 0 00:07:45.110 ==================================================================================== 00:07:45.110 Total 611840/s 2390 MiB/s 0 0' 00:07:45.110 08:25:37 -- accel/accel.sh@20 -- # IFS=: 00:07:45.110 08:25:37 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:07:45.110 08:25:37 -- accel/accel.sh@20 -- # read -r var val 00:07:45.110 08:25:37 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:07:45.110 08:25:37 -- accel/accel.sh@12 -- # build_accel_config 00:07:45.110 08:25:37 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:45.110 08:25:37 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:45.110 08:25:37 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:45.110 08:25:37 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:45.110 08:25:37 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:45.110 08:25:37 -- accel/accel.sh@41 -- # local IFS=, 00:07:45.110 08:25:37 -- accel/accel.sh@42 -- # jq -r . 00:07:45.370 [2024-10-04 08:25:37.793430] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
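With -C 2 each completed operation in the run above spans two 4096-byte buffers, which is where the higher per-core figure comes from:

    echo $(( 611840 * 4096 * 2 / 1024 / 1024 ))   # -> 4780 MiB/s

(The summary "Total" line appears to count a single 4 KiB buffer per operation, which is why it prints 2390 MiB/s for the same 611840 ops/s.)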
00:07:45.370 [2024-10-04 08:25:37.793483] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1002261 ] 00:07:45.370 EAL: No free 2048 kB hugepages reported on node 1 00:07:45.370 [2024-10-04 08:25:37.856594] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:45.370 [2024-10-04 08:25:37.890735] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:45.370 08:25:37 -- accel/accel.sh@21 -- # val= 00:07:45.370 08:25:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:45.370 08:25:37 -- accel/accel.sh@20 -- # IFS=: 00:07:45.370 08:25:37 -- accel/accel.sh@20 -- # read -r var val 00:07:45.370 08:25:37 -- accel/accel.sh@21 -- # val= 00:07:45.370 08:25:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:45.370 08:25:37 -- accel/accel.sh@20 -- # IFS=: 00:07:45.370 08:25:37 -- accel/accel.sh@20 -- # read -r var val 00:07:45.370 08:25:37 -- accel/accel.sh@21 -- # val=0x1 00:07:45.370 08:25:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:45.370 08:25:37 -- accel/accel.sh@20 -- # IFS=: 00:07:45.370 08:25:37 -- accel/accel.sh@20 -- # read -r var val 00:07:45.370 08:25:37 -- accel/accel.sh@21 -- # val= 00:07:45.370 08:25:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:45.370 08:25:37 -- accel/accel.sh@20 -- # IFS=: 00:07:45.370 08:25:37 -- accel/accel.sh@20 -- # read -r var val 00:07:45.370 08:25:37 -- accel/accel.sh@21 -- # val= 00:07:45.370 08:25:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:45.370 08:25:37 -- accel/accel.sh@20 -- # IFS=: 00:07:45.370 08:25:37 -- accel/accel.sh@20 -- # read -r var val 00:07:45.370 08:25:37 -- accel/accel.sh@21 -- # val=crc32c 00:07:45.370 08:25:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:45.370 08:25:37 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:07:45.370 08:25:37 -- accel/accel.sh@20 -- # IFS=: 00:07:45.370 08:25:37 -- accel/accel.sh@20 -- # read -r var val 00:07:45.370 08:25:37 -- accel/accel.sh@21 -- # val=0 00:07:45.370 08:25:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:45.370 08:25:37 -- accel/accel.sh@20 -- # IFS=: 00:07:45.370 08:25:37 -- accel/accel.sh@20 -- # read -r var val 00:07:45.370 08:25:37 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:45.370 08:25:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:45.370 08:25:37 -- accel/accel.sh@20 -- # IFS=: 00:07:45.370 08:25:37 -- accel/accel.sh@20 -- # read -r var val 00:07:45.370 08:25:37 -- accel/accel.sh@21 -- # val= 00:07:45.370 08:25:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:45.370 08:25:37 -- accel/accel.sh@20 -- # IFS=: 00:07:45.370 08:25:37 -- accel/accel.sh@20 -- # read -r var val 00:07:45.370 08:25:37 -- accel/accel.sh@21 -- # val=software 00:07:45.370 08:25:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:45.370 08:25:37 -- accel/accel.sh@23 -- # accel_module=software 00:07:45.370 08:25:37 -- accel/accel.sh@20 -- # IFS=: 00:07:45.370 08:25:37 -- accel/accel.sh@20 -- # read -r var val 00:07:45.370 08:25:37 -- accel/accel.sh@21 -- # val=32 00:07:45.370 08:25:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:45.370 08:25:37 -- accel/accel.sh@20 -- # IFS=: 00:07:45.370 08:25:37 -- accel/accel.sh@20 -- # read -r var val 00:07:45.370 08:25:37 -- accel/accel.sh@21 -- # val=32 00:07:45.370 08:25:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:45.370 08:25:37 -- accel/accel.sh@20 -- # IFS=: 00:07:45.370 08:25:37 -- accel/accel.sh@20 -- # read -r var val 00:07:45.370 08:25:37 -- 
accel/accel.sh@21 -- # val=1 00:07:45.370 08:25:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:45.370 08:25:37 -- accel/accel.sh@20 -- # IFS=: 00:07:45.370 08:25:37 -- accel/accel.sh@20 -- # read -r var val 00:07:45.370 08:25:37 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:45.370 08:25:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:45.370 08:25:37 -- accel/accel.sh@20 -- # IFS=: 00:07:45.370 08:25:37 -- accel/accel.sh@20 -- # read -r var val 00:07:45.370 08:25:37 -- accel/accel.sh@21 -- # val=Yes 00:07:45.370 08:25:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:45.370 08:25:37 -- accel/accel.sh@20 -- # IFS=: 00:07:45.370 08:25:37 -- accel/accel.sh@20 -- # read -r var val 00:07:45.370 08:25:37 -- accel/accel.sh@21 -- # val= 00:07:45.370 08:25:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:45.370 08:25:37 -- accel/accel.sh@20 -- # IFS=: 00:07:45.370 08:25:37 -- accel/accel.sh@20 -- # read -r var val 00:07:45.370 08:25:37 -- accel/accel.sh@21 -- # val= 00:07:45.370 08:25:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:45.370 08:25:37 -- accel/accel.sh@20 -- # IFS=: 00:07:45.370 08:25:37 -- accel/accel.sh@20 -- # read -r var val 00:07:46.751 08:25:39 -- accel/accel.sh@21 -- # val= 00:07:46.751 08:25:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:46.751 08:25:39 -- accel/accel.sh@20 -- # IFS=: 00:07:46.751 08:25:39 -- accel/accel.sh@20 -- # read -r var val 00:07:46.751 08:25:39 -- accel/accel.sh@21 -- # val= 00:07:46.751 08:25:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:46.751 08:25:39 -- accel/accel.sh@20 -- # IFS=: 00:07:46.751 08:25:39 -- accel/accel.sh@20 -- # read -r var val 00:07:46.751 08:25:39 -- accel/accel.sh@21 -- # val= 00:07:46.751 08:25:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:46.751 08:25:39 -- accel/accel.sh@20 -- # IFS=: 00:07:46.751 08:25:39 -- accel/accel.sh@20 -- # read -r var val 00:07:46.751 08:25:39 -- accel/accel.sh@21 -- # val= 00:07:46.751 08:25:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:46.751 08:25:39 -- accel/accel.sh@20 -- # IFS=: 00:07:46.751 08:25:39 -- accel/accel.sh@20 -- # read -r var val 00:07:46.751 08:25:39 -- accel/accel.sh@21 -- # val= 00:07:46.751 08:25:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:46.751 08:25:39 -- accel/accel.sh@20 -- # IFS=: 00:07:46.751 08:25:39 -- accel/accel.sh@20 -- # read -r var val 00:07:46.751 08:25:39 -- accel/accel.sh@21 -- # val= 00:07:46.752 08:25:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:46.752 08:25:39 -- accel/accel.sh@20 -- # IFS=: 00:07:46.752 08:25:39 -- accel/accel.sh@20 -- # read -r var val 00:07:46.752 08:25:39 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:46.752 08:25:39 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:07:46.752 08:25:39 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:46.752 00:07:46.752 real 0m2.545s 00:07:46.752 user 0m2.293s 00:07:46.752 sys 0m0.252s 00:07:46.752 08:25:39 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:46.752 08:25:39 -- common/autotest_common.sh@10 -- # set +x 00:07:46.752 ************************************ 00:07:46.752 END TEST accel_crc32c_C2 00:07:46.752 ************************************ 00:07:46.752 08:25:39 -- accel/accel.sh@95 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:07:46.752 08:25:39 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:07:46.752 08:25:39 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:46.752 08:25:39 -- common/autotest_common.sh@10 -- # set +x 00:07:46.752 ************************************ 00:07:46.752 START TEST accel_copy 
00:07:46.752 ************************************ 00:07:46.752 08:25:39 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy -y 00:07:46.752 08:25:39 -- accel/accel.sh@16 -- # local accel_opc 00:07:46.752 08:25:39 -- accel/accel.sh@17 -- # local accel_module 00:07:46.752 08:25:39 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy -y 00:07:46.752 08:25:39 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:07:46.752 08:25:39 -- accel/accel.sh@12 -- # build_accel_config 00:07:46.752 08:25:39 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:46.752 08:25:39 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:46.752 08:25:39 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:46.752 08:25:39 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:46.752 08:25:39 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:46.752 08:25:39 -- accel/accel.sh@41 -- # local IFS=, 00:07:46.752 08:25:39 -- accel/accel.sh@42 -- # jq -r . 00:07:46.752 [2024-10-04 08:25:39.106849] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:07:46.752 [2024-10-04 08:25:39.106941] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1002544 ] 00:07:46.752 EAL: No free 2048 kB hugepages reported on node 1 00:07:46.752 [2024-10-04 08:25:39.176511] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:46.752 [2024-10-04 08:25:39.211490] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:48.130 08:25:40 -- accel/accel.sh@18 -- # out=' 00:07:48.130 SPDK Configuration: 00:07:48.130 Core mask: 0x1 00:07:48.130 00:07:48.130 Accel Perf Configuration: 00:07:48.130 Workload Type: copy 00:07:48.130 Transfer size: 4096 bytes 00:07:48.130 Vector count 1 00:07:48.130 Module: software 00:07:48.130 Queue depth: 32 00:07:48.130 Allocate depth: 32 00:07:48.130 # threads/core: 1 00:07:48.130 Run time: 1 seconds 00:07:48.130 Verify: Yes 00:07:48.130 00:07:48.130 Running for 1 seconds... 00:07:48.130 00:07:48.130 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:48.130 ------------------------------------------------------------------------------------ 00:07:48.130 0,0 555808/s 2171 MiB/s 0 0 00:07:48.130 ==================================================================================== 00:07:48.130 Total 555808/s 2171 MiB/s 0 0' 00:07:48.130 08:25:40 -- accel/accel.sh@20 -- # IFS=: 00:07:48.130 08:25:40 -- accel/accel.sh@20 -- # read -r var val 00:07:48.130 08:25:40 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:07:48.130 08:25:40 -- accel/accel.sh@12 -- # build_accel_config 00:07:48.130 08:25:40 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:48.130 08:25:40 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:48.130 08:25:40 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:07:48.130 08:25:40 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:48.130 08:25:40 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:48.130 08:25:40 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:48.130 08:25:40 -- accel/accel.sh@41 -- # local IFS=, 00:07:48.130 08:25:40 -- accel/accel.sh@42 -- # jq -r . 00:07:48.130 [2024-10-04 08:25:40.392806] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
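A quick cross-check of the copy numbers above: with the 4096-byte transfer size from the configuration block, bandwidth is simply transfers/s times bytes per transfer. A one-line sanity check (illustrative shell arithmetic, not part of the harness):

    # 555808 transfers/s x 4096 B per transfer, converted to MiB/s
    echo $(( 555808 * 4096 / 1024 / 1024 ))    # prints 2171, matching the table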
00:07:48.130 [2024-10-04 08:25:40.392897] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1002729 ] 00:07:48.130 EAL: No free 2048 kB hugepages reported on node 1 00:07:48.130 [2024-10-04 08:25:40.462164] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:48.130 [2024-10-04 08:25:40.496969] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:48.130 08:25:40 -- accel/accel.sh@21 -- # val= 00:07:48.130 08:25:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:48.130 08:25:40 -- accel/accel.sh@20 -- # IFS=: 00:07:48.130 08:25:40 -- accel/accel.sh@20 -- # read -r var val 00:07:48.130 08:25:40 -- accel/accel.sh@21 -- # val= 00:07:48.130 08:25:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:48.130 08:25:40 -- accel/accel.sh@20 -- # IFS=: 00:07:48.130 08:25:40 -- accel/accel.sh@20 -- # read -r var val 00:07:48.131 08:25:40 -- accel/accel.sh@21 -- # val=0x1 00:07:48.131 08:25:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:48.131 08:25:40 -- accel/accel.sh@20 -- # IFS=: 00:07:48.131 08:25:40 -- accel/accel.sh@20 -- # read -r var val 00:07:48.131 08:25:40 -- accel/accel.sh@21 -- # val= 00:07:48.131 08:25:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:48.131 08:25:40 -- accel/accel.sh@20 -- # IFS=: 00:07:48.131 08:25:40 -- accel/accel.sh@20 -- # read -r var val 00:07:48.131 08:25:40 -- accel/accel.sh@21 -- # val= 00:07:48.131 08:25:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:48.131 08:25:40 -- accel/accel.sh@20 -- # IFS=: 00:07:48.131 08:25:40 -- accel/accel.sh@20 -- # read -r var val 00:07:48.131 08:25:40 -- accel/accel.sh@21 -- # val=copy 00:07:48.131 08:25:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:48.131 08:25:40 -- accel/accel.sh@24 -- # accel_opc=copy 00:07:48.131 08:25:40 -- accel/accel.sh@20 -- # IFS=: 00:07:48.131 08:25:40 -- accel/accel.sh@20 -- # read -r var val 00:07:48.131 08:25:40 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:48.131 08:25:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:48.131 08:25:40 -- accel/accel.sh@20 -- # IFS=: 00:07:48.131 08:25:40 -- accel/accel.sh@20 -- # read -r var val 00:07:48.131 08:25:40 -- accel/accel.sh@21 -- # val= 00:07:48.131 08:25:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:48.131 08:25:40 -- accel/accel.sh@20 -- # IFS=: 00:07:48.131 08:25:40 -- accel/accel.sh@20 -- # read -r var val 00:07:48.131 08:25:40 -- accel/accel.sh@21 -- # val=software 00:07:48.131 08:25:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:48.131 08:25:40 -- accel/accel.sh@23 -- # accel_module=software 00:07:48.131 08:25:40 -- accel/accel.sh@20 -- # IFS=: 00:07:48.131 08:25:40 -- accel/accel.sh@20 -- # read -r var val 00:07:48.131 08:25:40 -- accel/accel.sh@21 -- # val=32 00:07:48.131 08:25:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:48.131 08:25:40 -- accel/accel.sh@20 -- # IFS=: 00:07:48.131 08:25:40 -- accel/accel.sh@20 -- # read -r var val 00:07:48.131 08:25:40 -- accel/accel.sh@21 -- # val=32 00:07:48.131 08:25:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:48.131 08:25:40 -- accel/accel.sh@20 -- # IFS=: 00:07:48.131 08:25:40 -- accel/accel.sh@20 -- # read -r var val 00:07:48.131 08:25:40 -- accel/accel.sh@21 -- # val=1 00:07:48.131 08:25:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:48.131 08:25:40 -- accel/accel.sh@20 -- # IFS=: 00:07:48.131 08:25:40 -- accel/accel.sh@20 -- # read -r var val 00:07:48.131 08:25:40 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:07:48.131 08:25:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:48.131 08:25:40 -- accel/accel.sh@20 -- # IFS=: 00:07:48.131 08:25:40 -- accel/accel.sh@20 -- # read -r var val 00:07:48.131 08:25:40 -- accel/accel.sh@21 -- # val=Yes 00:07:48.131 08:25:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:48.131 08:25:40 -- accel/accel.sh@20 -- # IFS=: 00:07:48.131 08:25:40 -- accel/accel.sh@20 -- # read -r var val 00:07:48.131 08:25:40 -- accel/accel.sh@21 -- # val= 00:07:48.131 08:25:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:48.131 08:25:40 -- accel/accel.sh@20 -- # IFS=: 00:07:48.131 08:25:40 -- accel/accel.sh@20 -- # read -r var val 00:07:48.131 08:25:40 -- accel/accel.sh@21 -- # val= 00:07:48.131 08:25:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:48.131 08:25:40 -- accel/accel.sh@20 -- # IFS=: 00:07:48.131 08:25:40 -- accel/accel.sh@20 -- # read -r var val 00:07:49.070 08:25:41 -- accel/accel.sh@21 -- # val= 00:07:49.070 08:25:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:49.070 08:25:41 -- accel/accel.sh@20 -- # IFS=: 00:07:49.070 08:25:41 -- accel/accel.sh@20 -- # read -r var val 00:07:49.070 08:25:41 -- accel/accel.sh@21 -- # val= 00:07:49.070 08:25:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:49.070 08:25:41 -- accel/accel.sh@20 -- # IFS=: 00:07:49.070 08:25:41 -- accel/accel.sh@20 -- # read -r var val 00:07:49.070 08:25:41 -- accel/accel.sh@21 -- # val= 00:07:49.070 08:25:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:49.070 08:25:41 -- accel/accel.sh@20 -- # IFS=: 00:07:49.070 08:25:41 -- accel/accel.sh@20 -- # read -r var val 00:07:49.070 08:25:41 -- accel/accel.sh@21 -- # val= 00:07:49.070 08:25:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:49.070 08:25:41 -- accel/accel.sh@20 -- # IFS=: 00:07:49.070 08:25:41 -- accel/accel.sh@20 -- # read -r var val 00:07:49.070 08:25:41 -- accel/accel.sh@21 -- # val= 00:07:49.070 08:25:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:49.070 08:25:41 -- accel/accel.sh@20 -- # IFS=: 00:07:49.070 08:25:41 -- accel/accel.sh@20 -- # read -r var val 00:07:49.070 08:25:41 -- accel/accel.sh@21 -- # val= 00:07:49.070 08:25:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:49.070 08:25:41 -- accel/accel.sh@20 -- # IFS=: 00:07:49.070 08:25:41 -- accel/accel.sh@20 -- # read -r var val 00:07:49.070 08:25:41 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:49.070 08:25:41 -- accel/accel.sh@28 -- # [[ -n copy ]] 00:07:49.070 08:25:41 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:49.070 00:07:49.070 real 0m2.574s 00:07:49.070 user 0m2.325s 00:07:49.070 sys 0m0.249s 00:07:49.070 08:25:41 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:49.070 08:25:41 -- common/autotest_common.sh@10 -- # set +x 00:07:49.071 ************************************ 00:07:49.071 END TEST accel_copy 00:07:49.071 ************************************ 00:07:49.071 08:25:41 -- accel/accel.sh@96 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:49.071 08:25:41 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:07:49.071 08:25:41 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:49.071 08:25:41 -- common/autotest_common.sh@10 -- # set +x 00:07:49.071 ************************************ 00:07:49.071 START TEST accel_fill 00:07:49.071 ************************************ 00:07:49.071 08:25:41 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:49.071 08:25:41 -- accel/accel.sh@16 -- # local accel_opc 
00:07:49.071 08:25:41 -- accel/accel.sh@17 -- # local accel_module 00:07:49.071 08:25:41 -- accel/accel.sh@18 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:49.071 08:25:41 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:49.071 08:25:41 -- accel/accel.sh@12 -- # build_accel_config 00:07:49.071 08:25:41 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:49.071 08:25:41 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:49.071 08:25:41 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:49.071 08:25:41 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:49.071 08:25:41 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:49.071 08:25:41 -- accel/accel.sh@41 -- # local IFS=, 00:07:49.071 08:25:41 -- accel/accel.sh@42 -- # jq -r . 00:07:49.071 [2024-10-04 08:25:41.717532] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:07:49.071 [2024-10-04 08:25:41.717618] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1002904 ] 00:07:49.330 EAL: No free 2048 kB hugepages reported on node 1 00:07:49.330 [2024-10-04 08:25:41.786059] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:49.330 [2024-10-04 08:25:41.821617] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:50.710 08:25:42 -- accel/accel.sh@18 -- # out=' 00:07:50.710 SPDK Configuration: 00:07:50.710 Core mask: 0x1 00:07:50.710 00:07:50.710 Accel Perf Configuration: 00:07:50.710 Workload Type: fill 00:07:50.710 Fill pattern: 0x80 00:07:50.710 Transfer size: 4096 bytes 00:07:50.710 Vector count 1 00:07:50.710 Module: software 00:07:50.710 Queue depth: 64 00:07:50.710 Allocate depth: 64 00:07:50.710 # threads/core: 1 00:07:50.710 Run time: 1 seconds 00:07:50.710 Verify: Yes 00:07:50.710 00:07:50.710 Running for 1 seconds... 00:07:50.710 00:07:50.710 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:50.710 ------------------------------------------------------------------------------------ 00:07:50.710 0,0 971968/s 3796 MiB/s 0 0 00:07:50.710 ==================================================================================== 00:07:50.710 Total 971968/s 3796 MiB/s 0 0' 00:07:50.710 08:25:42 -- accel/accel.sh@20 -- # IFS=: 00:07:50.710 08:25:42 -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:50.710 08:25:42 -- accel/accel.sh@20 -- # read -r var val 00:07:50.710 08:25:42 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:50.710 08:25:42 -- accel/accel.sh@12 -- # build_accel_config 00:07:50.710 08:25:42 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:50.710 08:25:42 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:50.710 08:25:42 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:50.710 08:25:42 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:50.710 08:25:42 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:50.710 08:25:42 -- accel/accel.sh@41 -- # local IFS=, 00:07:50.710 08:25:42 -- accel/accel.sh@42 -- # jq -r . 00:07:50.710 [2024-10-04 08:25:42.990090] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
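The fill run above is wrapped by the harness, but the underlying accel_perf invocation is visible in the accel.sh@12 trace record. To reproduce it by hand, something along these lines should work (binary path and flags are copied from the trace; omitting the -c /dev/fd/62 config descriptor, and hence any module JSON, is an assumption):

    # -w fill: workload type; -f 128: fill byte (0x80); -q 64: queue depth;
    # -a 64: allocate depth; -t 1: run time in seconds; -y: verify results
    /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf \
        -t 1 -w fill -f 128 -q 64 -a 64 -y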
00:07:50.710 [2024-10-04 08:25:42.990145] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1003123 ] 00:07:50.710 EAL: No free 2048 kB hugepages reported on node 1 00:07:50.710 [2024-10-04 08:25:43.052206] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:50.710 [2024-10-04 08:25:43.086091] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:50.710 08:25:43 -- accel/accel.sh@21 -- # val= 00:07:50.710 08:25:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:50.710 08:25:43 -- accel/accel.sh@20 -- # IFS=: 00:07:50.710 08:25:43 -- accel/accel.sh@20 -- # read -r var val 00:07:50.710 08:25:43 -- accel/accel.sh@21 -- # val= 00:07:50.710 08:25:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:50.710 08:25:43 -- accel/accel.sh@20 -- # IFS=: 00:07:50.710 08:25:43 -- accel/accel.sh@20 -- # read -r var val 00:07:50.710 08:25:43 -- accel/accel.sh@21 -- # val=0x1 00:07:50.710 08:25:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:50.710 08:25:43 -- accel/accel.sh@20 -- # IFS=: 00:07:50.710 08:25:43 -- accel/accel.sh@20 -- # read -r var val 00:07:50.710 08:25:43 -- accel/accel.sh@21 -- # val= 00:07:50.710 08:25:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:50.710 08:25:43 -- accel/accel.sh@20 -- # IFS=: 00:07:50.710 08:25:43 -- accel/accel.sh@20 -- # read -r var val 00:07:50.710 08:25:43 -- accel/accel.sh@21 -- # val= 00:07:50.710 08:25:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:50.710 08:25:43 -- accel/accel.sh@20 -- # IFS=: 00:07:50.710 08:25:43 -- accel/accel.sh@20 -- # read -r var val 00:07:50.710 08:25:43 -- accel/accel.sh@21 -- # val=fill 00:07:50.710 08:25:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:50.710 08:25:43 -- accel/accel.sh@24 -- # accel_opc=fill 00:07:50.710 08:25:43 -- accel/accel.sh@20 -- # IFS=: 00:07:50.710 08:25:43 -- accel/accel.sh@20 -- # read -r var val 00:07:50.710 08:25:43 -- accel/accel.sh@21 -- # val=0x80 00:07:50.710 08:25:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:50.710 08:25:43 -- accel/accel.sh@20 -- # IFS=: 00:07:50.710 08:25:43 -- accel/accel.sh@20 -- # read -r var val 00:07:50.710 08:25:43 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:50.710 08:25:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:50.710 08:25:43 -- accel/accel.sh@20 -- # IFS=: 00:07:50.710 08:25:43 -- accel/accel.sh@20 -- # read -r var val 00:07:50.710 08:25:43 -- accel/accel.sh@21 -- # val= 00:07:50.710 08:25:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:50.710 08:25:43 -- accel/accel.sh@20 -- # IFS=: 00:07:50.710 08:25:43 -- accel/accel.sh@20 -- # read -r var val 00:07:50.710 08:25:43 -- accel/accel.sh@21 -- # val=software 00:07:50.710 08:25:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:50.710 08:25:43 -- accel/accel.sh@23 -- # accel_module=software 00:07:50.710 08:25:43 -- accel/accel.sh@20 -- # IFS=: 00:07:50.710 08:25:43 -- accel/accel.sh@20 -- # read -r var val 00:07:50.710 08:25:43 -- accel/accel.sh@21 -- # val=64 00:07:50.710 08:25:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:50.710 08:25:43 -- accel/accel.sh@20 -- # IFS=: 00:07:50.710 08:25:43 -- accel/accel.sh@20 -- # read -r var val 00:07:50.710 08:25:43 -- accel/accel.sh@21 -- # val=64 00:07:50.710 08:25:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:50.710 08:25:43 -- accel/accel.sh@20 -- # IFS=: 00:07:50.710 08:25:43 -- accel/accel.sh@20 -- # read -r var val 00:07:50.710 08:25:43 -- 
accel/accel.sh@21 -- # val=1 00:07:50.710 08:25:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:50.710 08:25:43 -- accel/accel.sh@20 -- # IFS=: 00:07:50.710 08:25:43 -- accel/accel.sh@20 -- # read -r var val 00:07:50.710 08:25:43 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:50.710 08:25:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:50.710 08:25:43 -- accel/accel.sh@20 -- # IFS=: 00:07:50.710 08:25:43 -- accel/accel.sh@20 -- # read -r var val 00:07:50.710 08:25:43 -- accel/accel.sh@21 -- # val=Yes 00:07:50.710 08:25:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:50.710 08:25:43 -- accel/accel.sh@20 -- # IFS=: 00:07:50.710 08:25:43 -- accel/accel.sh@20 -- # read -r var val 00:07:50.710 08:25:43 -- accel/accel.sh@21 -- # val= 00:07:50.710 08:25:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:50.710 08:25:43 -- accel/accel.sh@20 -- # IFS=: 00:07:50.710 08:25:43 -- accel/accel.sh@20 -- # read -r var val 00:07:50.710 08:25:43 -- accel/accel.sh@21 -- # val= 00:07:50.710 08:25:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:50.710 08:25:43 -- accel/accel.sh@20 -- # IFS=: 00:07:50.710 08:25:43 -- accel/accel.sh@20 -- # read -r var val 00:07:51.648 08:25:44 -- accel/accel.sh@21 -- # val= 00:07:51.648 08:25:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:51.648 08:25:44 -- accel/accel.sh@20 -- # IFS=: 00:07:51.648 08:25:44 -- accel/accel.sh@20 -- # read -r var val 00:07:51.648 08:25:44 -- accel/accel.sh@21 -- # val= 00:07:51.648 08:25:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:51.648 08:25:44 -- accel/accel.sh@20 -- # IFS=: 00:07:51.648 08:25:44 -- accel/accel.sh@20 -- # read -r var val 00:07:51.648 08:25:44 -- accel/accel.sh@21 -- # val= 00:07:51.648 08:25:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:51.648 08:25:44 -- accel/accel.sh@20 -- # IFS=: 00:07:51.648 08:25:44 -- accel/accel.sh@20 -- # read -r var val 00:07:51.648 08:25:44 -- accel/accel.sh@21 -- # val= 00:07:51.648 08:25:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:51.648 08:25:44 -- accel/accel.sh@20 -- # IFS=: 00:07:51.648 08:25:44 -- accel/accel.sh@20 -- # read -r var val 00:07:51.648 08:25:44 -- accel/accel.sh@21 -- # val= 00:07:51.648 08:25:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:51.648 08:25:44 -- accel/accel.sh@20 -- # IFS=: 00:07:51.648 08:25:44 -- accel/accel.sh@20 -- # read -r var val 00:07:51.648 08:25:44 -- accel/accel.sh@21 -- # val= 00:07:51.648 08:25:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:51.648 08:25:44 -- accel/accel.sh@20 -- # IFS=: 00:07:51.648 08:25:44 -- accel/accel.sh@20 -- # read -r var val 00:07:51.648 08:25:44 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:51.648 08:25:44 -- accel/accel.sh@28 -- # [[ -n fill ]] 00:07:51.648 08:25:44 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:51.648 00:07:51.648 real 0m2.551s 00:07:51.648 user 0m2.299s 00:07:51.648 sys 0m0.251s 00:07:51.649 08:25:44 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:51.649 08:25:44 -- common/autotest_common.sh@10 -- # set +x 00:07:51.649 ************************************ 00:07:51.649 END TEST accel_fill 00:07:51.649 ************************************ 00:07:51.649 08:25:44 -- accel/accel.sh@97 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:07:51.649 08:25:44 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:07:51.649 08:25:44 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:51.649 08:25:44 -- common/autotest_common.sh@10 -- # set +x 00:07:51.649 ************************************ 00:07:51.649 START TEST 
accel_copy_crc32c 00:07:51.649 ************************************ 00:07:51.649 08:25:44 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy_crc32c -y 00:07:51.649 08:25:44 -- accel/accel.sh@16 -- # local accel_opc 00:07:51.649 08:25:44 -- accel/accel.sh@17 -- # local accel_module 00:07:51.649 08:25:44 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y 00:07:51.649 08:25:44 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:07:51.649 08:25:44 -- accel/accel.sh@12 -- # build_accel_config 00:07:51.649 08:25:44 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:51.649 08:25:44 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:51.649 08:25:44 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:51.649 08:25:44 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:51.649 08:25:44 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:51.649 08:25:44 -- accel/accel.sh@41 -- # local IFS=, 00:07:51.649 08:25:44 -- accel/accel.sh@42 -- # jq -r . 00:07:51.649 [2024-10-04 08:25:44.306915] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:07:51.649 [2024-10-04 08:25:44.307001] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1003404 ] 00:07:51.908 EAL: No free 2048 kB hugepages reported on node 1 00:07:51.908 [2024-10-04 08:25:44.375041] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:51.908 [2024-10-04 08:25:44.410481] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:53.286 08:25:45 -- accel/accel.sh@18 -- # out=' 00:07:53.286 SPDK Configuration: 00:07:53.286 Core mask: 0x1 00:07:53.286 00:07:53.286 Accel Perf Configuration: 00:07:53.286 Workload Type: copy_crc32c 00:07:53.286 CRC-32C seed: 0 00:07:53.286 Vector size: 4096 bytes 00:07:53.286 Transfer size: 4096 bytes 00:07:53.286 Vector count 1 00:07:53.286 Module: software 00:07:53.286 Queue depth: 32 00:07:53.286 Allocate depth: 32 00:07:53.286 # threads/core: 1 00:07:53.286 Run time: 1 seconds 00:07:53.286 Verify: Yes 00:07:53.286 00:07:53.286 Running for 1 seconds... 00:07:53.286 00:07:53.286 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:53.286 ------------------------------------------------------------------------------------ 00:07:53.286 0,0 418848/s 1636 MiB/s 0 0 00:07:53.286 ==================================================================================== 00:07:53.286 Total 418848/s 1636 MiB/s 0 0' 00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # IFS=: 00:07:53.286 08:25:45 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # read -r var val 00:07:53.286 08:25:45 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:07:53.286 08:25:45 -- accel/accel.sh@12 -- # build_accel_config 00:07:53.286 08:25:45 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:53.286 08:25:45 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:53.286 08:25:45 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:53.286 08:25:45 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:53.286 08:25:45 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:53.286 08:25:45 -- accel/accel.sh@41 -- # local IFS=, 00:07:53.286 08:25:45 -- accel/accel.sh@42 -- # jq -r . 
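As the accel.sh@12 records show, each run hands accel_perf its JSON config over an inherited descriptor (-c /dev/fd/62), produced from the accel_json_cfg array piped through jq -r . A minimal stand-in for that plumbing via process substitution (simplified; whether accel_perf accepts an empty JSON object here is an assumption):

    # empty module config serialized with jq, exposed to the app as /dev/fd/NN
    /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf \
        -c <(echo '{}' | jq -r .) -t 1 -w copy_crc32c -y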
00:07:53.286 [2024-10-04 08:25:45.578638] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:07:53.286 [2024-10-04 08:25:45.578692] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1003678 ] 00:07:53.286 EAL: No free 2048 kB hugepages reported on node 1 00:07:53.286 [2024-10-04 08:25:45.641641] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:53.286 [2024-10-04 08:25:45.676088] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:53.286 08:25:45 -- accel/accel.sh@21 -- # val= 00:07:53.286 08:25:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # IFS=: 00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # read -r var val 00:07:53.286 08:25:45 -- accel/accel.sh@21 -- # val= 00:07:53.286 08:25:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # IFS=: 00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # read -r var val 00:07:53.286 08:25:45 -- accel/accel.sh@21 -- # val=0x1 00:07:53.286 08:25:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # IFS=: 00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # read -r var val 00:07:53.286 08:25:45 -- accel/accel.sh@21 -- # val= 00:07:53.286 08:25:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # IFS=: 00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # read -r var val 00:07:53.286 08:25:45 -- accel/accel.sh@21 -- # val= 00:07:53.286 08:25:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # IFS=: 00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # read -r var val 00:07:53.286 08:25:45 -- accel/accel.sh@21 -- # val=copy_crc32c 00:07:53.286 08:25:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:53.286 08:25:45 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # IFS=: 00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # read -r var val 00:07:53.286 08:25:45 -- accel/accel.sh@21 -- # val=0 00:07:53.286 08:25:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # IFS=: 00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # read -r var val 00:07:53.286 08:25:45 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:53.286 08:25:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # IFS=: 00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # read -r var val 00:07:53.286 08:25:45 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:53.286 08:25:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # IFS=: 00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # read -r var val 00:07:53.286 08:25:45 -- accel/accel.sh@21 -- # val= 00:07:53.286 08:25:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # IFS=: 00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # read -r var val 00:07:53.286 08:25:45 -- accel/accel.sh@21 -- # val=software 00:07:53.286 08:25:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:53.286 08:25:45 -- accel/accel.sh@23 -- # accel_module=software 00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # IFS=: 00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # read -r var val 00:07:53.286 08:25:45 -- accel/accel.sh@21 -- # val=32 00:07:53.286 08:25:45 -- accel/accel.sh@22 -- # case "$var" in 
00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # IFS=: 00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # read -r var val 00:07:53.286 08:25:45 -- accel/accel.sh@21 -- # val=32 00:07:53.286 08:25:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # IFS=: 00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # read -r var val 00:07:53.286 08:25:45 -- accel/accel.sh@21 -- # val=1 00:07:53.286 08:25:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # IFS=: 00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # read -r var val 00:07:53.286 08:25:45 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:53.286 08:25:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # IFS=: 00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # read -r var val 00:07:53.286 08:25:45 -- accel/accel.sh@21 -- # val=Yes 00:07:53.286 08:25:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # IFS=: 00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # read -r var val 00:07:53.286 08:25:45 -- accel/accel.sh@21 -- # val= 00:07:53.286 08:25:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # IFS=: 00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # read -r var val 00:07:53.286 08:25:45 -- accel/accel.sh@21 -- # val= 00:07:53.286 08:25:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # IFS=: 00:07:53.286 08:25:45 -- accel/accel.sh@20 -- # read -r var val 00:07:54.222 08:25:46 -- accel/accel.sh@21 -- # val= 00:07:54.222 08:25:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:54.222 08:25:46 -- accel/accel.sh@20 -- # IFS=: 00:07:54.222 08:25:46 -- accel/accel.sh@20 -- # read -r var val 00:07:54.222 08:25:46 -- accel/accel.sh@21 -- # val= 00:07:54.222 08:25:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:54.222 08:25:46 -- accel/accel.sh@20 -- # IFS=: 00:07:54.222 08:25:46 -- accel/accel.sh@20 -- # read -r var val 00:07:54.222 08:25:46 -- accel/accel.sh@21 -- # val= 00:07:54.222 08:25:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:54.222 08:25:46 -- accel/accel.sh@20 -- # IFS=: 00:07:54.222 08:25:46 -- accel/accel.sh@20 -- # read -r var val 00:07:54.223 08:25:46 -- accel/accel.sh@21 -- # val= 00:07:54.223 08:25:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:54.223 08:25:46 -- accel/accel.sh@20 -- # IFS=: 00:07:54.223 08:25:46 -- accel/accel.sh@20 -- # read -r var val 00:07:54.223 08:25:46 -- accel/accel.sh@21 -- # val= 00:07:54.223 08:25:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:54.223 08:25:46 -- accel/accel.sh@20 -- # IFS=: 00:07:54.223 08:25:46 -- accel/accel.sh@20 -- # read -r var val 00:07:54.223 08:25:46 -- accel/accel.sh@21 -- # val= 00:07:54.223 08:25:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:54.223 08:25:46 -- accel/accel.sh@20 -- # IFS=: 00:07:54.223 08:25:46 -- accel/accel.sh@20 -- # read -r var val 00:07:54.223 08:25:46 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:54.223 08:25:46 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:07:54.223 08:25:46 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:54.223 00:07:54.223 real 0m2.551s 00:07:54.223 user 0m2.313s 00:07:54.223 sys 0m0.237s 00:07:54.223 08:25:46 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:54.223 08:25:46 -- common/autotest_common.sh@10 -- # set +x 00:07:54.223 ************************************ 00:07:54.223 END TEST accel_copy_crc32c 00:07:54.223 ************************************ 00:07:54.223 
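In the accel_copy_crc32c_C2 run that follows, -C 2 raises the vector count to 2, so each operation moves two 4096-byte vectors, i.e. 8192 bytes per transfer. That explains the per-core row in the table below (301856 transfers/s works out to about 2358 MiB/s); the Total row's 1179 MiB/s is exactly half, consistent with that row being computed against the 4096-byte vector size instead:

    # 2 vectors x 4096 B = 8192 B per transfer
    echo $(( 301856 * 8192 / 1024 / 1024 ))    # prints 2358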
08:25:46 -- accel/accel.sh@98 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:07:54.223 08:25:46 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:07:54.223 08:25:46 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:54.223 08:25:46 -- common/autotest_common.sh@10 -- # set +x 00:07:54.223 ************************************ 00:07:54.223 START TEST accel_copy_crc32c_C2 00:07:54.223 ************************************ 00:07:54.223 08:25:46 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:07:54.223 08:25:46 -- accel/accel.sh@16 -- # local accel_opc 00:07:54.223 08:25:46 -- accel/accel.sh@17 -- # local accel_module 00:07:54.223 08:25:46 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:07:54.223 08:25:46 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:07:54.223 08:25:46 -- accel/accel.sh@12 -- # build_accel_config 00:07:54.223 08:25:46 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:54.223 08:25:46 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:54.223 08:25:46 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:54.223 08:25:46 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:54.223 08:25:46 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:54.223 08:25:46 -- accel/accel.sh@41 -- # local IFS=, 00:07:54.223 08:25:46 -- accel/accel.sh@42 -- # jq -r . 00:07:54.223 [2024-10-04 08:25:46.895613] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:07:54.223 [2024-10-04 08:25:46.895700] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1003973 ] 00:07:54.481 EAL: No free 2048 kB hugepages reported on node 1 00:07:54.481 [2024-10-04 08:25:46.963608] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:54.481 [2024-10-04 08:25:46.998483] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:55.505 08:25:48 -- accel/accel.sh@18 -- # out=' 00:07:55.505 SPDK Configuration: 00:07:55.505 Core mask: 0x1 00:07:55.505 00:07:55.505 Accel Perf Configuration: 00:07:55.505 Workload Type: copy_crc32c 00:07:55.505 CRC-32C seed: 0 00:07:55.505 Vector size: 4096 bytes 00:07:55.505 Transfer size: 8192 bytes 00:07:55.505 Vector count 2 00:07:55.505 Module: software 00:07:55.505 Queue depth: 32 00:07:55.505 Allocate depth: 32 00:07:55.505 # threads/core: 1 00:07:55.505 Run time: 1 seconds 00:07:55.505 Verify: Yes 00:07:55.505 00:07:55.505 Running for 1 seconds... 
00:07:55.505 00:07:55.505 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:55.505 ------------------------------------------------------------------------------------ 00:07:55.505 0,0 301856/s 2358 MiB/s 0 0 00:07:55.505 ==================================================================================== 00:07:55.505 Total 301856/s 1179 MiB/s 0 0' 00:07:55.505 08:25:48 -- accel/accel.sh@20 -- # IFS=: 00:07:55.505 08:25:48 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:07:55.505 08:25:48 -- accel/accel.sh@20 -- # read -r var val 00:07:55.505 08:25:48 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:07:55.505 08:25:48 -- accel/accel.sh@12 -- # build_accel_config 00:07:55.505 08:25:48 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:55.505 08:25:48 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:55.505 08:25:48 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:55.505 08:25:48 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:55.505 08:25:48 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:55.505 08:25:48 -- accel/accel.sh@41 -- # local IFS=, 00:07:55.505 08:25:48 -- accel/accel.sh@42 -- # jq -r . 00:07:55.505 [2024-10-04 08:25:48.167445] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:07:55.505 [2024-10-04 08:25:48.167499] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1004187 ] 00:07:55.764 EAL: No free 2048 kB hugepages reported on node 1 00:07:55.764 [2024-10-04 08:25:48.229173] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:55.764 [2024-10-04 08:25:48.263936] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:55.764 08:25:48 -- accel/accel.sh@21 -- # val= 00:07:55.764 08:25:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:55.764 08:25:48 -- accel/accel.sh@20 -- # IFS=: 00:07:55.764 08:25:48 -- accel/accel.sh@20 -- # read -r var val 00:07:55.764 08:25:48 -- accel/accel.sh@21 -- # val= 00:07:55.764 08:25:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:55.764 08:25:48 -- accel/accel.sh@20 -- # IFS=: 00:07:55.764 08:25:48 -- accel/accel.sh@20 -- # read -r var val 00:07:55.764 08:25:48 -- accel/accel.sh@21 -- # val=0x1 00:07:55.764 08:25:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:55.764 08:25:48 -- accel/accel.sh@20 -- # IFS=: 00:07:55.764 08:25:48 -- accel/accel.sh@20 -- # read -r var val 00:07:55.764 08:25:48 -- accel/accel.sh@21 -- # val= 00:07:55.764 08:25:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:55.764 08:25:48 -- accel/accel.sh@20 -- # IFS=: 00:07:55.764 08:25:48 -- accel/accel.sh@20 -- # read -r var val 00:07:55.764 08:25:48 -- accel/accel.sh@21 -- # val= 00:07:55.764 08:25:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:55.764 08:25:48 -- accel/accel.sh@20 -- # IFS=: 00:07:55.764 08:25:48 -- accel/accel.sh@20 -- # read -r var val 00:07:55.764 08:25:48 -- accel/accel.sh@21 -- # val=copy_crc32c 00:07:55.764 08:25:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:55.764 08:25:48 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:07:55.764 08:25:48 -- accel/accel.sh@20 -- # IFS=: 00:07:55.764 08:25:48 -- accel/accel.sh@20 -- # read -r var val 00:07:55.764 08:25:48 -- accel/accel.sh@21 -- # val=0 00:07:55.764 08:25:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:55.764 08:25:48 -- accel/accel.sh@20 -- # 
IFS=: 00:07:55.764 08:25:48 -- accel/accel.sh@20 -- # read -r var val 00:07:55.764 08:25:48 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:55.764 08:25:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:55.764 08:25:48 -- accel/accel.sh@20 -- # IFS=: 00:07:55.764 08:25:48 -- accel/accel.sh@20 -- # read -r var val 00:07:55.764 08:25:48 -- accel/accel.sh@21 -- # val='8192 bytes' 00:07:55.764 08:25:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:55.764 08:25:48 -- accel/accel.sh@20 -- # IFS=: 00:07:55.764 08:25:48 -- accel/accel.sh@20 -- # read -r var val 00:07:55.764 08:25:48 -- accel/accel.sh@21 -- # val= 00:07:55.764 08:25:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:55.764 08:25:48 -- accel/accel.sh@20 -- # IFS=: 00:07:55.764 08:25:48 -- accel/accel.sh@20 -- # read -r var val 00:07:55.764 08:25:48 -- accel/accel.sh@21 -- # val=software 00:07:55.764 08:25:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:55.764 08:25:48 -- accel/accel.sh@23 -- # accel_module=software 00:07:55.764 08:25:48 -- accel/accel.sh@20 -- # IFS=: 00:07:55.764 08:25:48 -- accel/accel.sh@20 -- # read -r var val 00:07:55.764 08:25:48 -- accel/accel.sh@21 -- # val=32 00:07:55.764 08:25:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:55.764 08:25:48 -- accel/accel.sh@20 -- # IFS=: 00:07:55.764 08:25:48 -- accel/accel.sh@20 -- # read -r var val 00:07:55.764 08:25:48 -- accel/accel.sh@21 -- # val=32 00:07:55.764 08:25:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:55.764 08:25:48 -- accel/accel.sh@20 -- # IFS=: 00:07:55.764 08:25:48 -- accel/accel.sh@20 -- # read -r var val 00:07:55.764 08:25:48 -- accel/accel.sh@21 -- # val=1 00:07:55.764 08:25:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:55.764 08:25:48 -- accel/accel.sh@20 -- # IFS=: 00:07:55.764 08:25:48 -- accel/accel.sh@20 -- # read -r var val 00:07:55.764 08:25:48 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:55.764 08:25:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:55.764 08:25:48 -- accel/accel.sh@20 -- # IFS=: 00:07:55.764 08:25:48 -- accel/accel.sh@20 -- # read -r var val 00:07:55.764 08:25:48 -- accel/accel.sh@21 -- # val=Yes 00:07:55.764 08:25:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:55.764 08:25:48 -- accel/accel.sh@20 -- # IFS=: 00:07:55.764 08:25:48 -- accel/accel.sh@20 -- # read -r var val 00:07:55.764 08:25:48 -- accel/accel.sh@21 -- # val= 00:07:55.764 08:25:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:55.764 08:25:48 -- accel/accel.sh@20 -- # IFS=: 00:07:55.764 08:25:48 -- accel/accel.sh@20 -- # read -r var val 00:07:55.764 08:25:48 -- accel/accel.sh@21 -- # val= 00:07:55.764 08:25:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:55.764 08:25:48 -- accel/accel.sh@20 -- # IFS=: 00:07:55.764 08:25:48 -- accel/accel.sh@20 -- # read -r var val 00:07:57.139 08:25:49 -- accel/accel.sh@21 -- # val= 00:07:57.139 08:25:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:57.139 08:25:49 -- accel/accel.sh@20 -- # IFS=: 00:07:57.139 08:25:49 -- accel/accel.sh@20 -- # read -r var val 00:07:57.139 08:25:49 -- accel/accel.sh@21 -- # val= 00:07:57.139 08:25:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:57.139 08:25:49 -- accel/accel.sh@20 -- # IFS=: 00:07:57.139 08:25:49 -- accel/accel.sh@20 -- # read -r var val 00:07:57.139 08:25:49 -- accel/accel.sh@21 -- # val= 00:07:57.139 08:25:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:57.139 08:25:49 -- accel/accel.sh@20 -- # IFS=: 00:07:57.139 08:25:49 -- accel/accel.sh@20 -- # read -r var val 00:07:57.139 08:25:49 -- accel/accel.sh@21 -- # val= 00:07:57.139 08:25:49 -- 
accel/accel.sh@22 -- # case "$var" in 00:07:57.139 08:25:49 -- accel/accel.sh@20 -- # IFS=: 00:07:57.139 08:25:49 -- accel/accel.sh@20 -- # read -r var val 00:07:57.139 08:25:49 -- accel/accel.sh@21 -- # val= 00:07:57.139 08:25:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:57.139 08:25:49 -- accel/accel.sh@20 -- # IFS=: 00:07:57.139 08:25:49 -- accel/accel.sh@20 -- # read -r var val 00:07:57.139 08:25:49 -- accel/accel.sh@21 -- # val= 00:07:57.139 08:25:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:57.139 08:25:49 -- accel/accel.sh@20 -- # IFS=: 00:07:57.139 08:25:49 -- accel/accel.sh@20 -- # read -r var val 00:07:57.139 08:25:49 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:57.139 08:25:49 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:07:57.139 08:25:49 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:57.139 00:07:57.139 real 0m2.552s 00:07:57.139 user 0m2.305s 00:07:57.139 sys 0m0.247s 00:07:57.139 08:25:49 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:57.139 08:25:49 -- common/autotest_common.sh@10 -- # set +x 00:07:57.139 ************************************ 00:07:57.139 END TEST accel_copy_crc32c_C2 00:07:57.139 ************************************ 00:07:57.139 08:25:49 -- accel/accel.sh@99 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:07:57.139 08:25:49 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:07:57.139 08:25:49 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:57.139 08:25:49 -- common/autotest_common.sh@10 -- # set +x 00:07:57.139 ************************************ 00:07:57.139 START TEST accel_dualcast 00:07:57.139 ************************************ 00:07:57.139 08:25:49 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dualcast -y 00:07:57.139 08:25:49 -- accel/accel.sh@16 -- # local accel_opc 00:07:57.139 08:25:49 -- accel/accel.sh@17 -- # local accel_module 00:07:57.139 08:25:49 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dualcast -y 00:07:57.139 08:25:49 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:07:57.139 08:25:49 -- accel/accel.sh@12 -- # build_accel_config 00:07:57.139 08:25:49 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:57.139 08:25:49 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:57.139 08:25:49 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:57.139 08:25:49 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:57.139 08:25:49 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:57.139 08:25:49 -- accel/accel.sh@41 -- # local IFS=, 00:07:57.139 08:25:49 -- accel/accel.sh@42 -- # jq -r . 00:07:57.139 [2024-10-04 08:25:49.484914] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
00:07:57.139 [2024-10-04 08:25:49.485028] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1004348 ] 00:07:57.139 EAL: No free 2048 kB hugepages reported on node 1 00:07:57.139 [2024-10-04 08:25:49.554030] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:57.139 [2024-10-04 08:25:49.589243] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:58.074 08:25:50 -- accel/accel.sh@18 -- # out=' 00:07:58.074 SPDK Configuration: 00:07:58.074 Core mask: 0x1 00:07:58.074 00:07:58.074 Accel Perf Configuration: 00:07:58.074 Workload Type: dualcast 00:07:58.074 Transfer size: 4096 bytes 00:07:58.074 Vector count 1 00:07:58.074 Module: software 00:07:58.074 Queue depth: 32 00:07:58.074 Allocate depth: 32 00:07:58.074 # threads/core: 1 00:07:58.074 Run time: 1 seconds 00:07:58.074 Verify: Yes 00:07:58.074 00:07:58.074 Running for 1 seconds... 00:07:58.074 00:07:58.074 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:58.074 ------------------------------------------------------------------------------------ 00:07:58.074 0,0 631392/s 2466 MiB/s 0 0 00:07:58.074 ==================================================================================== 00:07:58.074 Total 631392/s 2466 MiB/s 0 0' 00:07:58.074 08:25:50 -- accel/accel.sh@20 -- # IFS=: 00:07:58.074 08:25:50 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:07:58.074 08:25:50 -- accel/accel.sh@20 -- # read -r var val 00:07:58.074 08:25:50 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:07:58.074 08:25:50 -- accel/accel.sh@12 -- # build_accel_config 00:07:58.074 08:25:50 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:58.074 08:25:50 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:58.074 08:25:50 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:58.074 08:25:50 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:58.074 08:25:50 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:58.074 08:25:50 -- accel/accel.sh@41 -- # local IFS=, 00:07:58.074 08:25:50 -- accel/accel.sh@42 -- # jq -r . 00:07:58.334 [2024-10-04 08:25:50.757526] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
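The same bandwidth check can be applied mechanically to any result row; transfers/s sits in the second column of these tables. A throwaway awk sketch using the dualcast row above (column layout assumed from the tables, 4096-byte transfer size taken from the configuration block):

    echo '0,0 631392/s 2466 MiB/s 0 0' |
        awk '{ gsub("/s", "", $2); printf "%d MiB/s\n", $2 * 4096 / 1048576 }'    # 2466 MiB/s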
00:07:58.334 [2024-10-04 08:25:50.757581] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1004553 ] 00:07:58.334 EAL: No free 2048 kB hugepages reported on node 1 00:07:58.334 [2024-10-04 08:25:50.818427] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:58.334 [2024-10-04 08:25:50.852678] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:58.334 08:25:50 -- accel/accel.sh@21 -- # val= 00:07:58.334 08:25:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:58.334 08:25:50 -- accel/accel.sh@20 -- # IFS=: 00:07:58.334 08:25:50 -- accel/accel.sh@20 -- # read -r var val 00:07:58.334 08:25:50 -- accel/accel.sh@21 -- # val= 00:07:58.334 08:25:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:58.334 08:25:50 -- accel/accel.sh@20 -- # IFS=: 00:07:58.334 08:25:50 -- accel/accel.sh@20 -- # read -r var val 00:07:58.334 08:25:50 -- accel/accel.sh@21 -- # val=0x1 00:07:58.334 08:25:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:58.334 08:25:50 -- accel/accel.sh@20 -- # IFS=: 00:07:58.334 08:25:50 -- accel/accel.sh@20 -- # read -r var val 00:07:58.334 08:25:50 -- accel/accel.sh@21 -- # val= 00:07:58.334 08:25:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:58.334 08:25:50 -- accel/accel.sh@20 -- # IFS=: 00:07:58.334 08:25:50 -- accel/accel.sh@20 -- # read -r var val 00:07:58.334 08:25:50 -- accel/accel.sh@21 -- # val= 00:07:58.334 08:25:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:58.334 08:25:50 -- accel/accel.sh@20 -- # IFS=: 00:07:58.334 08:25:50 -- accel/accel.sh@20 -- # read -r var val 00:07:58.334 08:25:50 -- accel/accel.sh@21 -- # val=dualcast 00:07:58.334 08:25:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:58.334 08:25:50 -- accel/accel.sh@24 -- # accel_opc=dualcast 00:07:58.334 08:25:50 -- accel/accel.sh@20 -- # IFS=: 00:07:58.334 08:25:50 -- accel/accel.sh@20 -- # read -r var val 00:07:58.334 08:25:50 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:58.334 08:25:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:58.334 08:25:50 -- accel/accel.sh@20 -- # IFS=: 00:07:58.334 08:25:50 -- accel/accel.sh@20 -- # read -r var val 00:07:58.334 08:25:50 -- accel/accel.sh@21 -- # val= 00:07:58.334 08:25:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:58.334 08:25:50 -- accel/accel.sh@20 -- # IFS=: 00:07:58.334 08:25:50 -- accel/accel.sh@20 -- # read -r var val 00:07:58.334 08:25:50 -- accel/accel.sh@21 -- # val=software 00:07:58.334 08:25:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:58.334 08:25:50 -- accel/accel.sh@23 -- # accel_module=software 00:07:58.334 08:25:50 -- accel/accel.sh@20 -- # IFS=: 00:07:58.334 08:25:50 -- accel/accel.sh@20 -- # read -r var val 00:07:58.334 08:25:50 -- accel/accel.sh@21 -- # val=32 00:07:58.334 08:25:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:58.334 08:25:50 -- accel/accel.sh@20 -- # IFS=: 00:07:58.334 08:25:50 -- accel/accel.sh@20 -- # read -r var val 00:07:58.334 08:25:50 -- accel/accel.sh@21 -- # val=32 00:07:58.334 08:25:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:58.334 08:25:50 -- accel/accel.sh@20 -- # IFS=: 00:07:58.334 08:25:50 -- accel/accel.sh@20 -- # read -r var val 00:07:58.334 08:25:50 -- accel/accel.sh@21 -- # val=1 00:07:58.334 08:25:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:58.334 08:25:50 -- accel/accel.sh@20 -- # IFS=: 00:07:58.334 08:25:50 -- accel/accel.sh@20 -- # read -r var val 00:07:58.334 08:25:50 
-- accel/accel.sh@21 -- # val='1 seconds' 00:07:58.334 08:25:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:58.334 08:25:50 -- accel/accel.sh@20 -- # IFS=: 00:07:58.334 08:25:50 -- accel/accel.sh@20 -- # read -r var val 00:07:58.334 08:25:50 -- accel/accel.sh@21 -- # val=Yes 00:07:58.334 08:25:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:58.334 08:25:50 -- accel/accel.sh@20 -- # IFS=: 00:07:58.334 08:25:50 -- accel/accel.sh@20 -- # read -r var val 00:07:58.334 08:25:50 -- accel/accel.sh@21 -- # val= 00:07:58.334 08:25:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:58.334 08:25:50 -- accel/accel.sh@20 -- # IFS=: 00:07:58.334 08:25:50 -- accel/accel.sh@20 -- # read -r var val 00:07:58.334 08:25:50 -- accel/accel.sh@21 -- # val= 00:07:58.334 08:25:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:58.334 08:25:50 -- accel/accel.sh@20 -- # IFS=: 00:07:58.334 08:25:50 -- accel/accel.sh@20 -- # read -r var val 00:07:59.713 08:25:52 -- accel/accel.sh@21 -- # val= 00:07:59.713 08:25:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:59.713 08:25:52 -- accel/accel.sh@20 -- # IFS=: 00:07:59.713 08:25:52 -- accel/accel.sh@20 -- # read -r var val 00:07:59.713 08:25:52 -- accel/accel.sh@21 -- # val= 00:07:59.713 08:25:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:59.713 08:25:52 -- accel/accel.sh@20 -- # IFS=: 00:07:59.713 08:25:52 -- accel/accel.sh@20 -- # read -r var val 00:07:59.713 08:25:52 -- accel/accel.sh@21 -- # val= 00:07:59.713 08:25:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:59.713 08:25:52 -- accel/accel.sh@20 -- # IFS=: 00:07:59.713 08:25:52 -- accel/accel.sh@20 -- # read -r var val 00:07:59.713 08:25:52 -- accel/accel.sh@21 -- # val= 00:07:59.713 08:25:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:59.713 08:25:52 -- accel/accel.sh@20 -- # IFS=: 00:07:59.713 08:25:52 -- accel/accel.sh@20 -- # read -r var val 00:07:59.713 08:25:52 -- accel/accel.sh@21 -- # val= 00:07:59.713 08:25:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:59.713 08:25:52 -- accel/accel.sh@20 -- # IFS=: 00:07:59.713 08:25:52 -- accel/accel.sh@20 -- # read -r var val 00:07:59.713 08:25:52 -- accel/accel.sh@21 -- # val= 00:07:59.713 08:25:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:59.713 08:25:52 -- accel/accel.sh@20 -- # IFS=: 00:07:59.713 08:25:52 -- accel/accel.sh@20 -- # read -r var val 00:07:59.713 08:25:52 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:59.713 08:25:52 -- accel/accel.sh@28 -- # [[ -n dualcast ]] 00:07:59.713 08:25:52 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:59.713 00:07:59.713 real 0m2.550s 00:07:59.713 user 0m2.304s 00:07:59.713 sys 0m0.244s 00:07:59.713 08:25:52 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:59.713 08:25:52 -- common/autotest_common.sh@10 -- # set +x 00:07:59.713 ************************************ 00:07:59.713 END TEST accel_dualcast 00:07:59.713 ************************************ 00:07:59.713 08:25:52 -- accel/accel.sh@100 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:07:59.713 08:25:52 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:07:59.713 08:25:52 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:59.713 08:25:52 -- common/autotest_common.sh@10 -- # set +x 00:07:59.713 ************************************ 00:07:59.713 START TEST accel_compare 00:07:59.713 ************************************ 00:07:59.713 08:25:52 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w compare -y 00:07:59.713 08:25:52 -- accel/accel.sh@16 -- # local accel_opc 00:07:59.713 08:25:52 
-- accel/accel.sh@17 -- # local accel_module 00:07:59.713 08:25:52 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compare -y 00:07:59.713 08:25:52 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:07:59.713 08:25:52 -- accel/accel.sh@12 -- # build_accel_config 00:07:59.713 08:25:52 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:59.713 08:25:52 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:59.713 08:25:52 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:59.713 08:25:52 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:59.713 08:25:52 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:59.713 08:25:52 -- accel/accel.sh@41 -- # local IFS=, 00:07:59.713 08:25:52 -- accel/accel.sh@42 -- # jq -r . 00:07:59.713 [2024-10-04 08:25:52.068577] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:07:59.713 [2024-10-04 08:25:52.068664] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1004837 ] 00:07:59.713 EAL: No free 2048 kB hugepages reported on node 1 00:07:59.713 [2024-10-04 08:25:52.136810] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:59.713 [2024-10-04 08:25:52.171415] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:00.650 08:25:53 -- accel/accel.sh@18 -- # out=' 00:08:00.650 SPDK Configuration: 00:08:00.650 Core mask: 0x1 00:08:00.650 00:08:00.650 Accel Perf Configuration: 00:08:00.650 Workload Type: compare 00:08:00.650 Transfer size: 4096 bytes 00:08:00.650 Vector count 1 00:08:00.650 Module: software 00:08:00.650 Queue depth: 32 00:08:00.650 Allocate depth: 32 00:08:00.650 # threads/core: 1 00:08:00.650 Run time: 1 seconds 00:08:00.650 Verify: Yes 00:08:00.650 00:08:00.650 Running for 1 seconds... 00:08:00.650 00:08:00.650 Core,Thread Transfers Bandwidth Failed Miscompares 00:08:00.650 ------------------------------------------------------------------------------------ 00:08:00.650 0,0 828416/s 3236 MiB/s 0 0 00:08:00.650 ==================================================================================== 00:08:00.650 Total 828416/s 3236 MiB/s 0 0' 00:08:00.909 08:25:53 -- accel/accel.sh@20 -- # IFS=: 00:08:00.909 08:25:53 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:08:00.909 08:25:53 -- accel/accel.sh@20 -- # read -r var val 00:08:00.909 08:25:53 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:08:00.909 08:25:53 -- accel/accel.sh@12 -- # build_accel_config 00:08:00.909 08:25:53 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:00.909 08:25:53 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:00.909 08:25:53 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:00.909 08:25:53 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:00.909 08:25:53 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:00.909 08:25:53 -- accel/accel.sh@41 -- # local IFS=, 00:08:00.909 08:25:53 -- accel/accel.sh@42 -- # jq -r . 00:08:00.909 [2024-10-04 08:25:53.349083] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
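The long runs of "val=" records throughout this output are xtrace from the harness parsing the SPDK Configuration text it captured into $out (accel.sh@18): the repeated IFS=:, read -r var val, and case "$var" in records, together with the accel_opc= and accel_module= assignments at accel.sh@24 and @23, point to a loop of roughly this shape (reconstructed from the trace, not the verbatim accel.sh source; whitespace trimming omitted):

    # split each "Key: value" line of the captured output on ':' and record
    # the fields the tests assert on afterwards ([[ -n software ]] etc.)
    while IFS=: read -r var val; do
        case "$var" in
            *'Workload Type'*) accel_opc=$val ;;     # e.g. compare
            *Module*)          accel_module=$val ;;  # e.g. software
        esac
    done <<< "$out"    # feeding via a herestring is an assumption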
00:08:00.909 [2024-10-04 08:25:53.349177] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1005103 ] 00:08:00.909 EAL: No free 2048 kB hugepages reported on node 1 00:08:00.909 [2024-10-04 08:25:53.416301] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:00.909 [2024-10-04 08:25:53.449054] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:00.909 08:25:53 -- accel/accel.sh@21 -- # val= 00:08:00.909 08:25:53 -- accel/accel.sh@22 -- # case "$var" in 00:08:00.909 08:25:53 -- accel/accel.sh@20 -- # IFS=: 00:08:00.909 08:25:53 -- accel/accel.sh@20 -- # read -r var val 00:08:00.909 08:25:53 -- accel/accel.sh@21 -- # val= 00:08:00.909 08:25:53 -- accel/accel.sh@22 -- # case "$var" in 00:08:00.909 08:25:53 -- accel/accel.sh@20 -- # IFS=: 00:08:00.909 08:25:53 -- accel/accel.sh@20 -- # read -r var val 00:08:00.910 08:25:53 -- accel/accel.sh@21 -- # val=0x1 00:08:00.910 08:25:53 -- accel/accel.sh@22 -- # case "$var" in 00:08:00.910 08:25:53 -- accel/accel.sh@20 -- # IFS=: 00:08:00.910 08:25:53 -- accel/accel.sh@20 -- # read -r var val 00:08:00.910 08:25:53 -- accel/accel.sh@21 -- # val= 00:08:00.910 08:25:53 -- accel/accel.sh@22 -- # case "$var" in 00:08:00.910 08:25:53 -- accel/accel.sh@20 -- # IFS=: 00:08:00.910 08:25:53 -- accel/accel.sh@20 -- # read -r var val 00:08:00.910 08:25:53 -- accel/accel.sh@21 -- # val= 00:08:00.910 08:25:53 -- accel/accel.sh@22 -- # case "$var" in 00:08:00.910 08:25:53 -- accel/accel.sh@20 -- # IFS=: 00:08:00.910 08:25:53 -- accel/accel.sh@20 -- # read -r var val 00:08:00.910 08:25:53 -- accel/accel.sh@21 -- # val=compare 00:08:00.910 08:25:53 -- accel/accel.sh@22 -- # case "$var" in 00:08:00.910 08:25:53 -- accel/accel.sh@24 -- # accel_opc=compare 00:08:00.910 08:25:53 -- accel/accel.sh@20 -- # IFS=: 00:08:00.910 08:25:53 -- accel/accel.sh@20 -- # read -r var val 00:08:00.910 08:25:53 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:00.910 08:25:53 -- accel/accel.sh@22 -- # case "$var" in 00:08:00.910 08:25:53 -- accel/accel.sh@20 -- # IFS=: 00:08:00.910 08:25:53 -- accel/accel.sh@20 -- # read -r var val 00:08:00.910 08:25:53 -- accel/accel.sh@21 -- # val= 00:08:00.910 08:25:53 -- accel/accel.sh@22 -- # case "$var" in 00:08:00.910 08:25:53 -- accel/accel.sh@20 -- # IFS=: 00:08:00.910 08:25:53 -- accel/accel.sh@20 -- # read -r var val 00:08:00.910 08:25:53 -- accel/accel.sh@21 -- # val=software 00:08:00.910 08:25:53 -- accel/accel.sh@22 -- # case "$var" in 00:08:00.910 08:25:53 -- accel/accel.sh@23 -- # accel_module=software 00:08:00.910 08:25:53 -- accel/accel.sh@20 -- # IFS=: 00:08:00.910 08:25:53 -- accel/accel.sh@20 -- # read -r var val 00:08:00.910 08:25:53 -- accel/accel.sh@21 -- # val=32 00:08:00.910 08:25:53 -- accel/accel.sh@22 -- # case "$var" in 00:08:00.910 08:25:53 -- accel/accel.sh@20 -- # IFS=: 00:08:00.910 08:25:53 -- accel/accel.sh@20 -- # read -r var val 00:08:00.910 08:25:53 -- accel/accel.sh@21 -- # val=32 00:08:00.910 08:25:53 -- accel/accel.sh@22 -- # case "$var" in 00:08:00.910 08:25:53 -- accel/accel.sh@20 -- # IFS=: 00:08:00.910 08:25:53 -- accel/accel.sh@20 -- # read -r var val 00:08:00.910 08:25:53 -- accel/accel.sh@21 -- # val=1 00:08:00.910 08:25:53 -- accel/accel.sh@22 -- # case "$var" in 00:08:00.910 08:25:53 -- accel/accel.sh@20 -- # IFS=: 00:08:00.910 08:25:53 -- accel/accel.sh@20 -- # read -r var val 00:08:00.910 08:25:53 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:08:00.910 08:25:53 -- accel/accel.sh@22 -- # case "$var" in 00:08:00.910 08:25:53 -- accel/accel.sh@20 -- # IFS=: 00:08:00.910 08:25:53 -- accel/accel.sh@20 -- # read -r var val 00:08:00.910 08:25:53 -- accel/accel.sh@21 -- # val=Yes 00:08:00.910 08:25:53 -- accel/accel.sh@22 -- # case "$var" in 00:08:00.910 08:25:53 -- accel/accel.sh@20 -- # IFS=: 00:08:00.910 08:25:53 -- accel/accel.sh@20 -- # read -r var val 00:08:00.910 08:25:53 -- accel/accel.sh@21 -- # val= 00:08:00.910 08:25:53 -- accel/accel.sh@22 -- # case "$var" in 00:08:00.910 08:25:53 -- accel/accel.sh@20 -- # IFS=: 00:08:00.910 08:25:53 -- accel/accel.sh@20 -- # read -r var val 00:08:00.910 08:25:53 -- accel/accel.sh@21 -- # val= 00:08:00.910 08:25:53 -- accel/accel.sh@22 -- # case "$var" in 00:08:00.910 08:25:53 -- accel/accel.sh@20 -- # IFS=: 00:08:00.910 08:25:53 -- accel/accel.sh@20 -- # read -r var val 00:08:02.286 08:25:54 -- accel/accel.sh@21 -- # val= 00:08:02.286 08:25:54 -- accel/accel.sh@22 -- # case "$var" in 00:08:02.286 08:25:54 -- accel/accel.sh@20 -- # IFS=: 00:08:02.286 08:25:54 -- accel/accel.sh@20 -- # read -r var val 00:08:02.286 08:25:54 -- accel/accel.sh@21 -- # val= 00:08:02.286 08:25:54 -- accel/accel.sh@22 -- # case "$var" in 00:08:02.286 08:25:54 -- accel/accel.sh@20 -- # IFS=: 00:08:02.286 08:25:54 -- accel/accel.sh@20 -- # read -r var val 00:08:02.286 08:25:54 -- accel/accel.sh@21 -- # val= 00:08:02.286 08:25:54 -- accel/accel.sh@22 -- # case "$var" in 00:08:02.286 08:25:54 -- accel/accel.sh@20 -- # IFS=: 00:08:02.286 08:25:54 -- accel/accel.sh@20 -- # read -r var val 00:08:02.286 08:25:54 -- accel/accel.sh@21 -- # val= 00:08:02.286 08:25:54 -- accel/accel.sh@22 -- # case "$var" in 00:08:02.286 08:25:54 -- accel/accel.sh@20 -- # IFS=: 00:08:02.286 08:25:54 -- accel/accel.sh@20 -- # read -r var val 00:08:02.286 08:25:54 -- accel/accel.sh@21 -- # val= 00:08:02.286 08:25:54 -- accel/accel.sh@22 -- # case "$var" in 00:08:02.286 08:25:54 -- accel/accel.sh@20 -- # IFS=: 00:08:02.286 08:25:54 -- accel/accel.sh@20 -- # read -r var val 00:08:02.286 08:25:54 -- accel/accel.sh@21 -- # val= 00:08:02.286 08:25:54 -- accel/accel.sh@22 -- # case "$var" in 00:08:02.286 08:25:54 -- accel/accel.sh@20 -- # IFS=: 00:08:02.286 08:25:54 -- accel/accel.sh@20 -- # read -r var val 00:08:02.286 08:25:54 -- accel/accel.sh@28 -- # [[ -n software ]] 00:08:02.286 08:25:54 -- accel/accel.sh@28 -- # [[ -n compare ]] 00:08:02.286 08:25:54 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:02.286 00:08:02.286 real 0m2.561s 00:08:02.286 user 0m2.311s 00:08:02.286 sys 0m0.248s 00:08:02.286 08:25:54 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:02.286 08:25:54 -- common/autotest_common.sh@10 -- # set +x 00:08:02.286 ************************************ 00:08:02.286 END TEST accel_compare 00:08:02.286 ************************************ 00:08:02.286 08:25:54 -- accel/accel.sh@101 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:08:02.286 08:25:54 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:08:02.286 08:25:54 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:02.286 08:25:54 -- common/autotest_common.sh@10 -- # set +x 00:08:02.286 ************************************ 00:08:02.286 START TEST accel_xor 00:08:02.286 ************************************ 00:08:02.286 08:25:54 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w xor -y 00:08:02.286 08:25:54 -- accel/accel.sh@16 -- # local accel_opc 00:08:02.286 08:25:54 -- accel/accel.sh@17 
-- # local accel_module 00:08:02.286 08:25:54 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y 00:08:02.286 08:25:54 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:08:02.286 08:25:54 -- accel/accel.sh@12 -- # build_accel_config 00:08:02.286 08:25:54 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:02.286 08:25:54 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:02.286 08:25:54 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:02.286 08:25:54 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:02.286 08:25:54 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:02.286 08:25:54 -- accel/accel.sh@41 -- # local IFS=, 00:08:02.286 08:25:54 -- accel/accel.sh@42 -- # jq -r . 00:08:02.286 [2024-10-04 08:25:54.669397] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:08:02.286 [2024-10-04 08:25:54.669506] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1005392 ] 00:08:02.286 EAL: No free 2048 kB hugepages reported on node 1 00:08:02.286 [2024-10-04 08:25:54.737131] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:02.286 [2024-10-04 08:25:54.771860] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:03.666 08:25:55 -- accel/accel.sh@18 -- # out=' 00:08:03.666 SPDK Configuration: 00:08:03.666 Core mask: 0x1 00:08:03.666 00:08:03.666 Accel Perf Configuration: 00:08:03.666 Workload Type: xor 00:08:03.666 Source buffers: 2 00:08:03.666 Transfer size: 4096 bytes 00:08:03.666 Vector count 1 00:08:03.666 Module: software 00:08:03.666 Queue depth: 32 00:08:03.666 Allocate depth: 32 00:08:03.666 # threads/core: 1 00:08:03.666 Run time: 1 seconds 00:08:03.666 Verify: Yes 00:08:03.666 00:08:03.666 Running for 1 seconds... 00:08:03.666 00:08:03.666 Core,Thread Transfers Bandwidth Failed Miscompares 00:08:03.666 ------------------------------------------------------------------------------------ 00:08:03.666 0,0 716576/s 2799 MiB/s 0 0 00:08:03.666 ==================================================================================== 00:08:03.666 Total 716576/s 2799 MiB/s 0 0' 00:08:03.666 08:25:55 -- accel/accel.sh@20 -- # IFS=: 00:08:03.666 08:25:55 -- accel/accel.sh@20 -- # read -r var val 00:08:03.666 08:25:55 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:08:03.666 08:25:55 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:08:03.666 08:25:55 -- accel/accel.sh@12 -- # build_accel_config 00:08:03.666 08:25:55 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:03.666 08:25:55 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:03.666 08:25:55 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:03.666 08:25:55 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:03.666 08:25:55 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:03.666 08:25:55 -- accel/accel.sh@41 -- # local IFS=, 00:08:03.666 08:25:55 -- accel/accel.sh@42 -- # jq -r . 00:08:03.666 [2024-10-04 08:25:55.940314] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
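(The xor pass initializing here uses the same binary with only the workload switched; a standalone equivalent under the same assumed ./spdk layout:

  ./spdk/build/examples/accel_perf -t 1 -w xor -y

With no -x argument the run XORs two source buffers, per the "Source buffers: 2" line in the first run's configuration dump.)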
00:08:03.666 [2024-10-04 08:25:55.940367] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1005605 ] 00:08:03.666 EAL: No free 2048 kB hugepages reported on node 1 00:08:03.666 [2024-10-04 08:25:56.002109] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:03.666 [2024-10-04 08:25:56.036172] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:03.666 08:25:56 -- accel/accel.sh@21 -- # val= 00:08:03.666 08:25:56 -- accel/accel.sh@22 -- # case "$var" in 00:08:03.666 08:25:56 -- accel/accel.sh@20 -- # IFS=: 00:08:03.666 08:25:56 -- accel/accel.sh@20 -- # read -r var val 00:08:03.666 08:25:56 -- accel/accel.sh@21 -- # val= 00:08:03.666 08:25:56 -- accel/accel.sh@22 -- # case "$var" in 00:08:03.666 08:25:56 -- accel/accel.sh@20 -- # IFS=: 00:08:03.666 08:25:56 -- accel/accel.sh@20 -- # read -r var val 00:08:03.666 08:25:56 -- accel/accel.sh@21 -- # val=0x1 00:08:03.666 08:25:56 -- accel/accel.sh@22 -- # case "$var" in 00:08:03.666 08:25:56 -- accel/accel.sh@20 -- # IFS=: 00:08:03.666 08:25:56 -- accel/accel.sh@20 -- # read -r var val 00:08:03.666 08:25:56 -- accel/accel.sh@21 -- # val= 00:08:03.666 08:25:56 -- accel/accel.sh@22 -- # case "$var" in 00:08:03.666 08:25:56 -- accel/accel.sh@20 -- # IFS=: 00:08:03.666 08:25:56 -- accel/accel.sh@20 -- # read -r var val 00:08:03.666 08:25:56 -- accel/accel.sh@21 -- # val= 00:08:03.666 08:25:56 -- accel/accel.sh@22 -- # case "$var" in 00:08:03.666 08:25:56 -- accel/accel.sh@20 -- # IFS=: 00:08:03.666 08:25:56 -- accel/accel.sh@20 -- # read -r var val 00:08:03.666 08:25:56 -- accel/accel.sh@21 -- # val=xor 00:08:03.666 08:25:56 -- accel/accel.sh@22 -- # case "$var" in 00:08:03.666 08:25:56 -- accel/accel.sh@24 -- # accel_opc=xor 00:08:03.666 08:25:56 -- accel/accel.sh@20 -- # IFS=: 00:08:03.666 08:25:56 -- accel/accel.sh@20 -- # read -r var val 00:08:03.666 08:25:56 -- accel/accel.sh@21 -- # val=2 00:08:03.666 08:25:56 -- accel/accel.sh@22 -- # case "$var" in 00:08:03.666 08:25:56 -- accel/accel.sh@20 -- # IFS=: 00:08:03.666 08:25:56 -- accel/accel.sh@20 -- # read -r var val 00:08:03.666 08:25:56 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:03.666 08:25:56 -- accel/accel.sh@22 -- # case "$var" in 00:08:03.666 08:25:56 -- accel/accel.sh@20 -- # IFS=: 00:08:03.666 08:25:56 -- accel/accel.sh@20 -- # read -r var val 00:08:03.666 08:25:56 -- accel/accel.sh@21 -- # val= 00:08:03.666 08:25:56 -- accel/accel.sh@22 -- # case "$var" in 00:08:03.666 08:25:56 -- accel/accel.sh@20 -- # IFS=: 00:08:03.666 08:25:56 -- accel/accel.sh@20 -- # read -r var val 00:08:03.666 08:25:56 -- accel/accel.sh@21 -- # val=software 00:08:03.666 08:25:56 -- accel/accel.sh@22 -- # case "$var" in 00:08:03.666 08:25:56 -- accel/accel.sh@23 -- # accel_module=software 00:08:03.666 08:25:56 -- accel/accel.sh@20 -- # IFS=: 00:08:03.666 08:25:56 -- accel/accel.sh@20 -- # read -r var val 00:08:03.666 08:25:56 -- accel/accel.sh@21 -- # val=32 00:08:03.666 08:25:56 -- accel/accel.sh@22 -- # case "$var" in 00:08:03.666 08:25:56 -- accel/accel.sh@20 -- # IFS=: 00:08:03.666 08:25:56 -- accel/accel.sh@20 -- # read -r var val 00:08:03.666 08:25:56 -- accel/accel.sh@21 -- # val=32 00:08:03.666 08:25:56 -- accel/accel.sh@22 -- # case "$var" in 00:08:03.666 08:25:56 -- accel/accel.sh@20 -- # IFS=: 00:08:03.666 08:25:56 -- accel/accel.sh@20 -- # read -r var val 00:08:03.666 08:25:56 -- 
accel/accel.sh@21 -- # val=1 00:08:03.666 08:25:56 -- accel/accel.sh@22 -- # case "$var" in 00:08:03.666 08:25:56 -- accel/accel.sh@20 -- # IFS=: 00:08:03.666 08:25:56 -- accel/accel.sh@20 -- # read -r var val 00:08:03.666 08:25:56 -- accel/accel.sh@21 -- # val='1 seconds' 00:08:03.666 08:25:56 -- accel/accel.sh@22 -- # case "$var" in 00:08:03.666 08:25:56 -- accel/accel.sh@20 -- # IFS=: 00:08:03.666 08:25:56 -- accel/accel.sh@20 -- # read -r var val 00:08:03.666 08:25:56 -- accel/accel.sh@21 -- # val=Yes 00:08:03.666 08:25:56 -- accel/accel.sh@22 -- # case "$var" in 00:08:03.666 08:25:56 -- accel/accel.sh@20 -- # IFS=: 00:08:03.666 08:25:56 -- accel/accel.sh@20 -- # read -r var val 00:08:03.666 08:25:56 -- accel/accel.sh@21 -- # val= 00:08:03.666 08:25:56 -- accel/accel.sh@22 -- # case "$var" in 00:08:03.666 08:25:56 -- accel/accel.sh@20 -- # IFS=: 00:08:03.666 08:25:56 -- accel/accel.sh@20 -- # read -r var val 00:08:03.666 08:25:56 -- accel/accel.sh@21 -- # val= 00:08:03.666 08:25:56 -- accel/accel.sh@22 -- # case "$var" in 00:08:03.666 08:25:56 -- accel/accel.sh@20 -- # IFS=: 00:08:03.666 08:25:56 -- accel/accel.sh@20 -- # read -r var val 00:08:04.602 08:25:57 -- accel/accel.sh@21 -- # val= 00:08:04.602 08:25:57 -- accel/accel.sh@22 -- # case "$var" in 00:08:04.602 08:25:57 -- accel/accel.sh@20 -- # IFS=: 00:08:04.602 08:25:57 -- accel/accel.sh@20 -- # read -r var val 00:08:04.602 08:25:57 -- accel/accel.sh@21 -- # val= 00:08:04.602 08:25:57 -- accel/accel.sh@22 -- # case "$var" in 00:08:04.602 08:25:57 -- accel/accel.sh@20 -- # IFS=: 00:08:04.602 08:25:57 -- accel/accel.sh@20 -- # read -r var val 00:08:04.602 08:25:57 -- accel/accel.sh@21 -- # val= 00:08:04.602 08:25:57 -- accel/accel.sh@22 -- # case "$var" in 00:08:04.602 08:25:57 -- accel/accel.sh@20 -- # IFS=: 00:08:04.602 08:25:57 -- accel/accel.sh@20 -- # read -r var val 00:08:04.602 08:25:57 -- accel/accel.sh@21 -- # val= 00:08:04.602 08:25:57 -- accel/accel.sh@22 -- # case "$var" in 00:08:04.602 08:25:57 -- accel/accel.sh@20 -- # IFS=: 00:08:04.602 08:25:57 -- accel/accel.sh@20 -- # read -r var val 00:08:04.602 08:25:57 -- accel/accel.sh@21 -- # val= 00:08:04.602 08:25:57 -- accel/accel.sh@22 -- # case "$var" in 00:08:04.602 08:25:57 -- accel/accel.sh@20 -- # IFS=: 00:08:04.602 08:25:57 -- accel/accel.sh@20 -- # read -r var val 00:08:04.602 08:25:57 -- accel/accel.sh@21 -- # val= 00:08:04.602 08:25:57 -- accel/accel.sh@22 -- # case "$var" in 00:08:04.602 08:25:57 -- accel/accel.sh@20 -- # IFS=: 00:08:04.602 08:25:57 -- accel/accel.sh@20 -- # read -r var val 00:08:04.602 08:25:57 -- accel/accel.sh@28 -- # [[ -n software ]] 00:08:04.602 08:25:57 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:08:04.602 08:25:57 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:04.602 00:08:04.602 real 0m2.548s 00:08:04.602 user 0m2.309s 00:08:04.602 sys 0m0.239s 00:08:04.602 08:25:57 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:04.602 08:25:57 -- common/autotest_common.sh@10 -- # set +x 00:08:04.602 ************************************ 00:08:04.602 END TEST accel_xor 00:08:04.602 ************************************ 00:08:04.602 08:25:57 -- accel/accel.sh@102 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:08:04.602 08:25:57 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:08:04.602 08:25:57 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:04.602 08:25:57 -- common/autotest_common.sh@10 -- # set +x 00:08:04.602 ************************************ 00:08:04.602 START TEST accel_xor 
00:08:04.602 ************************************ 00:08:04.602 08:25:57 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w xor -y -x 3 00:08:04.602 08:25:57 -- accel/accel.sh@16 -- # local accel_opc 00:08:04.602 08:25:57 -- accel/accel.sh@17 -- # local accel_module 00:08:04.602 08:25:57 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y -x 3 00:08:04.602 08:25:57 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:08:04.602 08:25:57 -- accel/accel.sh@12 -- # build_accel_config 00:08:04.602 08:25:57 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:04.602 08:25:57 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:04.602 08:25:57 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:04.602 08:25:57 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:04.602 08:25:57 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:04.602 08:25:57 -- accel/accel.sh@41 -- # local IFS=, 00:08:04.602 08:25:57 -- accel/accel.sh@42 -- # jq -r . 00:08:04.602 [2024-10-04 08:25:57.242483] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:08:04.602 [2024-10-04 08:25:57.242537] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1005773 ] 00:08:04.602 EAL: No free 2048 kB hugepages reported on node 1 00:08:04.861 [2024-10-04 08:25:57.303744] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:04.861 [2024-10-04 08:25:57.338629] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:06.237 08:25:58 -- accel/accel.sh@18 -- # out=' 00:08:06.237 SPDK Configuration: 00:08:06.237 Core mask: 0x1 00:08:06.237 00:08:06.237 Accel Perf Configuration: 00:08:06.237 Workload Type: xor 00:08:06.237 Source buffers: 3 00:08:06.237 Transfer size: 4096 bytes 00:08:06.237 Vector count 1 00:08:06.237 Module: software 00:08:06.237 Queue depth: 32 00:08:06.237 Allocate depth: 32 00:08:06.237 # threads/core: 1 00:08:06.237 Run time: 1 seconds 00:08:06.237 Verify: Yes 00:08:06.237 00:08:06.237 Running for 1 seconds... 00:08:06.237 00:08:06.237 Core,Thread Transfers Bandwidth Failed Miscompares 00:08:06.237 ------------------------------------------------------------------------------------ 00:08:06.237 0,0 674880/s 2636 MiB/s 0 0 00:08:06.237 ==================================================================================== 00:08:06.237 Total 674880/s 2636 MiB/s 0 0' 00:08:06.237 08:25:58 -- accel/accel.sh@20 -- # IFS=: 00:08:06.237 08:25:58 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:08:06.237 08:25:58 -- accel/accel.sh@20 -- # read -r var val 00:08:06.237 08:25:58 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:08:06.237 08:25:58 -- accel/accel.sh@12 -- # build_accel_config 00:08:06.237 08:25:58 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:06.237 08:25:58 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:06.237 08:25:58 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:06.237 08:25:58 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:06.237 08:25:58 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:06.237 08:25:58 -- accel/accel.sh@41 -- # local IFS=, 00:08:06.237 08:25:58 -- accel/accel.sh@42 -- # jq -r . 00:08:06.237 [2024-10-04 08:25:58.517731] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
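(For the three-buffer xor variant set up above, the only change is the -x flag:

  ./spdk/build/examples/accel_perf -t 1 -w xor -y -x 3

-x sets the number of xor source buffers, hence "Source buffers: 3" in the configuration dump.)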
00:08:06.237 [2024-10-04 08:25:58.517845] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1005964 ] 00:08:06.237 EAL: No free 2048 kB hugepages reported on node 1 00:08:06.237 [2024-10-04 08:25:58.586215] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:06.237 [2024-10-04 08:25:58.621242] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:06.237 08:25:58 -- accel/accel.sh@21 -- # val= 00:08:06.237 08:25:58 -- accel/accel.sh@22 -- # case "$var" in 00:08:06.237 08:25:58 -- accel/accel.sh@20 -- # IFS=: 00:08:06.237 08:25:58 -- accel/accel.sh@20 -- # read -r var val 00:08:06.237 08:25:58 -- accel/accel.sh@21 -- # val= 00:08:06.237 08:25:58 -- accel/accel.sh@22 -- # case "$var" in 00:08:06.237 08:25:58 -- accel/accel.sh@20 -- # IFS=: 00:08:06.237 08:25:58 -- accel/accel.sh@20 -- # read -r var val 00:08:06.237 08:25:58 -- accel/accel.sh@21 -- # val=0x1 00:08:06.237 08:25:58 -- accel/accel.sh@22 -- # case "$var" in 00:08:06.237 08:25:58 -- accel/accel.sh@20 -- # IFS=: 00:08:06.237 08:25:58 -- accel/accel.sh@20 -- # read -r var val 00:08:06.237 08:25:58 -- accel/accel.sh@21 -- # val= 00:08:06.237 08:25:58 -- accel/accel.sh@22 -- # case "$var" in 00:08:06.237 08:25:58 -- accel/accel.sh@20 -- # IFS=: 00:08:06.237 08:25:58 -- accel/accel.sh@20 -- # read -r var val 00:08:06.237 08:25:58 -- accel/accel.sh@21 -- # val= 00:08:06.237 08:25:58 -- accel/accel.sh@22 -- # case "$var" in 00:08:06.237 08:25:58 -- accel/accel.sh@20 -- # IFS=: 00:08:06.237 08:25:58 -- accel/accel.sh@20 -- # read -r var val 00:08:06.237 08:25:58 -- accel/accel.sh@21 -- # val=xor 00:08:06.237 08:25:58 -- accel/accel.sh@22 -- # case "$var" in 00:08:06.237 08:25:58 -- accel/accel.sh@24 -- # accel_opc=xor 00:08:06.237 08:25:58 -- accel/accel.sh@20 -- # IFS=: 00:08:06.237 08:25:58 -- accel/accel.sh@20 -- # read -r var val 00:08:06.237 08:25:58 -- accel/accel.sh@21 -- # val=3 00:08:06.237 08:25:58 -- accel/accel.sh@22 -- # case "$var" in 00:08:06.237 08:25:58 -- accel/accel.sh@20 -- # IFS=: 00:08:06.237 08:25:58 -- accel/accel.sh@20 -- # read -r var val 00:08:06.237 08:25:58 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:06.237 08:25:58 -- accel/accel.sh@22 -- # case "$var" in 00:08:06.237 08:25:58 -- accel/accel.sh@20 -- # IFS=: 00:08:06.237 08:25:58 -- accel/accel.sh@20 -- # read -r var val 00:08:06.237 08:25:58 -- accel/accel.sh@21 -- # val= 00:08:06.237 08:25:58 -- accel/accel.sh@22 -- # case "$var" in 00:08:06.237 08:25:58 -- accel/accel.sh@20 -- # IFS=: 00:08:06.237 08:25:58 -- accel/accel.sh@20 -- # read -r var val 00:08:06.237 08:25:58 -- accel/accel.sh@21 -- # val=software 00:08:06.237 08:25:58 -- accel/accel.sh@22 -- # case "$var" in 00:08:06.237 08:25:58 -- accel/accel.sh@23 -- # accel_module=software 00:08:06.237 08:25:58 -- accel/accel.sh@20 -- # IFS=: 00:08:06.237 08:25:58 -- accel/accel.sh@20 -- # read -r var val 00:08:06.237 08:25:58 -- accel/accel.sh@21 -- # val=32 00:08:06.237 08:25:58 -- accel/accel.sh@22 -- # case "$var" in 00:08:06.237 08:25:58 -- accel/accel.sh@20 -- # IFS=: 00:08:06.237 08:25:58 -- accel/accel.sh@20 -- # read -r var val 00:08:06.237 08:25:58 -- accel/accel.sh@21 -- # val=32 00:08:06.237 08:25:58 -- accel/accel.sh@22 -- # case "$var" in 00:08:06.237 08:25:58 -- accel/accel.sh@20 -- # IFS=: 00:08:06.237 08:25:58 -- accel/accel.sh@20 -- # read -r var val 00:08:06.237 08:25:58 -- 
accel/accel.sh@21 -- # val=1 00:08:06.237 08:25:58 -- accel/accel.sh@22 -- # case "$var" in 00:08:06.237 08:25:58 -- accel/accel.sh@20 -- # IFS=: 00:08:06.237 08:25:58 -- accel/accel.sh@20 -- # read -r var val 00:08:06.237 08:25:58 -- accel/accel.sh@21 -- # val='1 seconds' 00:08:06.237 08:25:58 -- accel/accel.sh@22 -- # case "$var" in 00:08:06.237 08:25:58 -- accel/accel.sh@20 -- # IFS=: 00:08:06.237 08:25:58 -- accel/accel.sh@20 -- # read -r var val 00:08:06.237 08:25:58 -- accel/accel.sh@21 -- # val=Yes 00:08:06.237 08:25:58 -- accel/accel.sh@22 -- # case "$var" in 00:08:06.237 08:25:58 -- accel/accel.sh@20 -- # IFS=: 00:08:06.237 08:25:58 -- accel/accel.sh@20 -- # read -r var val 00:08:06.237 08:25:58 -- accel/accel.sh@21 -- # val= 00:08:06.237 08:25:58 -- accel/accel.sh@22 -- # case "$var" in 00:08:06.237 08:25:58 -- accel/accel.sh@20 -- # IFS=: 00:08:06.237 08:25:58 -- accel/accel.sh@20 -- # read -r var val 00:08:06.237 08:25:58 -- accel/accel.sh@21 -- # val= 00:08:06.237 08:25:58 -- accel/accel.sh@22 -- # case "$var" in 00:08:06.237 08:25:58 -- accel/accel.sh@20 -- # IFS=: 00:08:06.237 08:25:58 -- accel/accel.sh@20 -- # read -r var val 00:08:07.174 08:25:59 -- accel/accel.sh@21 -- # val= 00:08:07.174 08:25:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:07.174 08:25:59 -- accel/accel.sh@20 -- # IFS=: 00:08:07.174 08:25:59 -- accel/accel.sh@20 -- # read -r var val 00:08:07.174 08:25:59 -- accel/accel.sh@21 -- # val= 00:08:07.174 08:25:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:07.174 08:25:59 -- accel/accel.sh@20 -- # IFS=: 00:08:07.174 08:25:59 -- accel/accel.sh@20 -- # read -r var val 00:08:07.174 08:25:59 -- accel/accel.sh@21 -- # val= 00:08:07.174 08:25:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:07.174 08:25:59 -- accel/accel.sh@20 -- # IFS=: 00:08:07.174 08:25:59 -- accel/accel.sh@20 -- # read -r var val 00:08:07.174 08:25:59 -- accel/accel.sh@21 -- # val= 00:08:07.174 08:25:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:07.174 08:25:59 -- accel/accel.sh@20 -- # IFS=: 00:08:07.174 08:25:59 -- accel/accel.sh@20 -- # read -r var val 00:08:07.174 08:25:59 -- accel/accel.sh@21 -- # val= 00:08:07.174 08:25:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:07.174 08:25:59 -- accel/accel.sh@20 -- # IFS=: 00:08:07.174 08:25:59 -- accel/accel.sh@20 -- # read -r var val 00:08:07.174 08:25:59 -- accel/accel.sh@21 -- # val= 00:08:07.174 08:25:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:07.174 08:25:59 -- accel/accel.sh@20 -- # IFS=: 00:08:07.174 08:25:59 -- accel/accel.sh@20 -- # read -r var val 00:08:07.174 08:25:59 -- accel/accel.sh@28 -- # [[ -n software ]] 00:08:07.174 08:25:59 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:08:07.174 08:25:59 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:07.174 00:08:07.174 real 0m2.549s 00:08:07.174 user 0m2.301s 00:08:07.174 sys 0m0.247s 00:08:07.174 08:25:59 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:07.174 08:25:59 -- common/autotest_common.sh@10 -- # set +x 00:08:07.174 ************************************ 00:08:07.174 END TEST accel_xor 00:08:07.174 ************************************ 00:08:07.174 08:25:59 -- accel/accel.sh@103 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:08:07.174 08:25:59 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:08:07.174 08:25:59 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:07.174 08:25:59 -- common/autotest_common.sh@10 -- # set +x 00:08:07.174 ************************************ 00:08:07.174 START TEST 
accel_dif_verify 00:08:07.174 ************************************ 00:08:07.174 08:25:59 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_verify 00:08:07.174 08:25:59 -- accel/accel.sh@16 -- # local accel_opc 00:08:07.174 08:25:59 -- accel/accel.sh@17 -- # local accel_module 00:08:07.174 08:25:59 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_verify 00:08:07.174 08:25:59 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:08:07.174 08:25:59 -- accel/accel.sh@12 -- # build_accel_config 00:08:07.174 08:25:59 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:07.174 08:25:59 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:07.174 08:25:59 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:07.174 08:25:59 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:07.174 08:25:59 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:07.174 08:25:59 -- accel/accel.sh@41 -- # local IFS=, 00:08:07.174 08:25:59 -- accel/accel.sh@42 -- # jq -r . 00:08:07.174 [2024-10-04 08:25:59.828669] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:08:07.174 [2024-10-04 08:25:59.828721] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1006253 ] 00:08:07.433 EAL: No free 2048 kB hugepages reported on node 1 00:08:07.433 [2024-10-04 08:25:59.890624] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:07.433 [2024-10-04 08:25:59.925268] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:08.810 08:26:01 -- accel/accel.sh@18 -- # out=' 00:08:08.810 SPDK Configuration: 00:08:08.810 Core mask: 0x1 00:08:08.810 00:08:08.810 Accel Perf Configuration: 00:08:08.810 Workload Type: dif_verify 00:08:08.810 Vector size: 4096 bytes 00:08:08.810 Transfer size: 4096 bytes 00:08:08.810 Block size: 512 bytes 00:08:08.810 Metadata size: 8 bytes 00:08:08.810 Vector count 1 00:08:08.810 Module: software 00:08:08.810 Queue depth: 32 00:08:08.810 Allocate depth: 32 00:08:08.810 # threads/core: 1 00:08:08.810 Run time: 1 seconds 00:08:08.810 Verify: No 00:08:08.810 00:08:08.810 Running for 1 seconds... 00:08:08.810 00:08:08.810 Core,Thread Transfers Bandwidth Failed Miscompares 00:08:08.810 ------------------------------------------------------------------------------------ 00:08:08.810 0,0 239008/s 948 MiB/s 0 0 00:08:08.810 ==================================================================================== 00:08:08.810 Total 239008/s 933 MiB/s 0 0' 00:08:08.810 08:26:01 -- accel/accel.sh@20 -- # IFS=: 00:08:08.810 08:26:01 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:08:08.810 08:26:01 -- accel/accel.sh@20 -- # read -r var val 00:08:08.810 08:26:01 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:08:08.810 08:26:01 -- accel/accel.sh@12 -- # build_accel_config 00:08:08.810 08:26:01 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:08.810 08:26:01 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:08.810 08:26:01 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:08.810 08:26:01 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:08.810 08:26:01 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:08.810 08:26:01 -- accel/accel.sh@41 -- # local IFS=, 00:08:08.810 08:26:01 -- accel/accel.sh@42 -- # jq -r . 
00:08:08.810 [2024-10-04 08:26:01.094049] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:08:08.810 [2024-10-04 08:26:01.094103] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1006524 ] 00:08:08.810 EAL: No free 2048 kB hugepages reported on node 1 00:08:08.810 [2024-10-04 08:26:01.155946] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:08.810 [2024-10-04 08:26:01.190333] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:08.810 08:26:01 -- accel/accel.sh@21 -- # val= 00:08:08.810 08:26:01 -- accel/accel.sh@22 -- # case "$var" in 00:08:08.810 08:26:01 -- accel/accel.sh@20 -- # IFS=: 00:08:08.810 08:26:01 -- accel/accel.sh@20 -- # read -r var val 00:08:08.810 08:26:01 -- accel/accel.sh@21 -- # val= 00:08:08.810 08:26:01 -- accel/accel.sh@22 -- # case "$var" in 00:08:08.810 08:26:01 -- accel/accel.sh@20 -- # IFS=: 00:08:08.810 08:26:01 -- accel/accel.sh@20 -- # read -r var val 00:08:08.810 08:26:01 -- accel/accel.sh@21 -- # val=0x1 00:08:08.810 08:26:01 -- accel/accel.sh@22 -- # case "$var" in 00:08:08.810 08:26:01 -- accel/accel.sh@20 -- # IFS=: 00:08:08.810 08:26:01 -- accel/accel.sh@20 -- # read -r var val 00:08:08.810 08:26:01 -- accel/accel.sh@21 -- # val= 00:08:08.810 08:26:01 -- accel/accel.sh@22 -- # case "$var" in 00:08:08.810 08:26:01 -- accel/accel.sh@20 -- # IFS=: 00:08:08.810 08:26:01 -- accel/accel.sh@20 -- # read -r var val 00:08:08.810 08:26:01 -- accel/accel.sh@21 -- # val= 00:08:08.810 08:26:01 -- accel/accel.sh@22 -- # case "$var" in 00:08:08.810 08:26:01 -- accel/accel.sh@20 -- # IFS=: 00:08:08.810 08:26:01 -- accel/accel.sh@20 -- # read -r var val 00:08:08.810 08:26:01 -- accel/accel.sh@21 -- # val=dif_verify 00:08:08.810 08:26:01 -- accel/accel.sh@22 -- # case "$var" in 00:08:08.810 08:26:01 -- accel/accel.sh@24 -- # accel_opc=dif_verify 00:08:08.810 08:26:01 -- accel/accel.sh@20 -- # IFS=: 00:08:08.810 08:26:01 -- accel/accel.sh@20 -- # read -r var val 00:08:08.810 08:26:01 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:08.810 08:26:01 -- accel/accel.sh@22 -- # case "$var" in 00:08:08.810 08:26:01 -- accel/accel.sh@20 -- # IFS=: 00:08:08.810 08:26:01 -- accel/accel.sh@20 -- # read -r var val 00:08:08.810 08:26:01 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:08.810 08:26:01 -- accel/accel.sh@22 -- # case "$var" in 00:08:08.810 08:26:01 -- accel/accel.sh@20 -- # IFS=: 00:08:08.810 08:26:01 -- accel/accel.sh@20 -- # read -r var val 00:08:08.810 08:26:01 -- accel/accel.sh@21 -- # val='512 bytes' 00:08:08.810 08:26:01 -- accel/accel.sh@22 -- # case "$var" in 00:08:08.810 08:26:01 -- accel/accel.sh@20 -- # IFS=: 00:08:08.810 08:26:01 -- accel/accel.sh@20 -- # read -r var val 00:08:08.810 08:26:01 -- accel/accel.sh@21 -- # val='8 bytes' 00:08:08.810 08:26:01 -- accel/accel.sh@22 -- # case "$var" in 00:08:08.810 08:26:01 -- accel/accel.sh@20 -- # IFS=: 00:08:08.810 08:26:01 -- accel/accel.sh@20 -- # read -r var val 00:08:08.810 08:26:01 -- accel/accel.sh@21 -- # val= 00:08:08.810 08:26:01 -- accel/accel.sh@22 -- # case "$var" in 00:08:08.810 08:26:01 -- accel/accel.sh@20 -- # IFS=: 00:08:08.810 08:26:01 -- accel/accel.sh@20 -- # read -r var val 00:08:08.810 08:26:01 -- accel/accel.sh@21 -- # val=software 00:08:08.810 08:26:01 -- accel/accel.sh@22 -- # case "$var" in 00:08:08.810 08:26:01 -- accel/accel.sh@23 -- # 
accel_module=software 00:08:08.810 08:26:01 -- accel/accel.sh@20 -- # IFS=: 00:08:08.810 08:26:01 -- accel/accel.sh@20 -- # read -r var val 00:08:08.810 08:26:01 -- accel/accel.sh@21 -- # val=32 00:08:08.810 08:26:01 -- accel/accel.sh@22 -- # case "$var" in 00:08:08.810 08:26:01 -- accel/accel.sh@20 -- # IFS=: 00:08:08.810 08:26:01 -- accel/accel.sh@20 -- # read -r var val 00:08:08.810 08:26:01 -- accel/accel.sh@21 -- # val=32 00:08:08.810 08:26:01 -- accel/accel.sh@22 -- # case "$var" in 00:08:08.810 08:26:01 -- accel/accel.sh@20 -- # IFS=: 00:08:08.810 08:26:01 -- accel/accel.sh@20 -- # read -r var val 00:08:08.810 08:26:01 -- accel/accel.sh@21 -- # val=1 00:08:08.811 08:26:01 -- accel/accel.sh@22 -- # case "$var" in 00:08:08.811 08:26:01 -- accel/accel.sh@20 -- # IFS=: 00:08:08.811 08:26:01 -- accel/accel.sh@20 -- # read -r var val 00:08:08.811 08:26:01 -- accel/accel.sh@21 -- # val='1 seconds' 00:08:08.811 08:26:01 -- accel/accel.sh@22 -- # case "$var" in 00:08:08.811 08:26:01 -- accel/accel.sh@20 -- # IFS=: 00:08:08.811 08:26:01 -- accel/accel.sh@20 -- # read -r var val 00:08:08.811 08:26:01 -- accel/accel.sh@21 -- # val=No 00:08:08.811 08:26:01 -- accel/accel.sh@22 -- # case "$var" in 00:08:08.811 08:26:01 -- accel/accel.sh@20 -- # IFS=: 00:08:08.811 08:26:01 -- accel/accel.sh@20 -- # read -r var val 00:08:08.811 08:26:01 -- accel/accel.sh@21 -- # val= 00:08:08.811 08:26:01 -- accel/accel.sh@22 -- # case "$var" in 00:08:08.811 08:26:01 -- accel/accel.sh@20 -- # IFS=: 00:08:08.811 08:26:01 -- accel/accel.sh@20 -- # read -r var val 00:08:08.811 08:26:01 -- accel/accel.sh@21 -- # val= 00:08:08.811 08:26:01 -- accel/accel.sh@22 -- # case "$var" in 00:08:08.811 08:26:01 -- accel/accel.sh@20 -- # IFS=: 00:08:08.811 08:26:01 -- accel/accel.sh@20 -- # read -r var val 00:08:09.746 08:26:02 -- accel/accel.sh@21 -- # val= 00:08:09.746 08:26:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:09.746 08:26:02 -- accel/accel.sh@20 -- # IFS=: 00:08:09.746 08:26:02 -- accel/accel.sh@20 -- # read -r var val 00:08:09.746 08:26:02 -- accel/accel.sh@21 -- # val= 00:08:09.746 08:26:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:09.746 08:26:02 -- accel/accel.sh@20 -- # IFS=: 00:08:09.746 08:26:02 -- accel/accel.sh@20 -- # read -r var val 00:08:09.746 08:26:02 -- accel/accel.sh@21 -- # val= 00:08:09.746 08:26:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:09.746 08:26:02 -- accel/accel.sh@20 -- # IFS=: 00:08:09.746 08:26:02 -- accel/accel.sh@20 -- # read -r var val 00:08:09.746 08:26:02 -- accel/accel.sh@21 -- # val= 00:08:09.746 08:26:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:09.746 08:26:02 -- accel/accel.sh@20 -- # IFS=: 00:08:09.746 08:26:02 -- accel/accel.sh@20 -- # read -r var val 00:08:09.746 08:26:02 -- accel/accel.sh@21 -- # val= 00:08:09.746 08:26:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:09.746 08:26:02 -- accel/accel.sh@20 -- # IFS=: 00:08:09.746 08:26:02 -- accel/accel.sh@20 -- # read -r var val 00:08:09.746 08:26:02 -- accel/accel.sh@21 -- # val= 00:08:09.746 08:26:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:09.746 08:26:02 -- accel/accel.sh@20 -- # IFS=: 00:08:09.746 08:26:02 -- accel/accel.sh@20 -- # read -r var val 00:08:09.746 08:26:02 -- accel/accel.sh@28 -- # [[ -n software ]] 00:08:09.746 08:26:02 -- accel/accel.sh@28 -- # [[ -n dif_verify ]] 00:08:09.746 08:26:02 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:09.746 00:08:09.746 real 0m2.531s 00:08:09.746 user 0m2.297s 00:08:09.746 sys 0m0.234s 00:08:09.746 08:26:02 -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:08:09.746 08:26:02 -- common/autotest_common.sh@10 -- # set +x 00:08:09.746 ************************************ 00:08:09.746 END TEST accel_dif_verify 00:08:09.746 ************************************ 00:08:09.746 08:26:02 -- accel/accel.sh@104 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:08:09.746 08:26:02 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:08:09.746 08:26:02 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:09.746 08:26:02 -- common/autotest_common.sh@10 -- # set +x 00:08:09.746 ************************************ 00:08:09.746 START TEST accel_dif_generate 00:08:09.746 ************************************ 00:08:09.746 08:26:02 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_generate 00:08:09.746 08:26:02 -- accel/accel.sh@16 -- # local accel_opc 00:08:09.746 08:26:02 -- accel/accel.sh@17 -- # local accel_module 00:08:09.746 08:26:02 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate 00:08:09.746 08:26:02 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:08:09.746 08:26:02 -- accel/accel.sh@12 -- # build_accel_config 00:08:09.746 08:26:02 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:09.746 08:26:02 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:09.746 08:26:02 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:09.746 08:26:02 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:09.746 08:26:02 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:09.746 08:26:02 -- accel/accel.sh@41 -- # local IFS=, 00:08:09.746 08:26:02 -- accel/accel.sh@42 -- # jq -r . 00:08:09.746 [2024-10-04 08:26:02.408628] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:08:09.746 [2024-10-04 08:26:02.408718] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1006805 ] 00:08:10.004 EAL: No free 2048 kB hugepages reported on node 1 00:08:10.004 [2024-10-04 08:26:02.476609] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:10.004 [2024-10-04 08:26:02.511371] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:11.382 08:26:03 -- accel/accel.sh@18 -- # out=' 00:08:11.382 SPDK Configuration: 00:08:11.382 Core mask: 0x1 00:08:11.382 00:08:11.382 Accel Perf Configuration: 00:08:11.382 Workload Type: dif_generate 00:08:11.382 Vector size: 4096 bytes 00:08:11.382 Transfer size: 4096 bytes 00:08:11.382 Block size: 512 bytes 00:08:11.382 Metadata size: 8 bytes 00:08:11.382 Vector count 1 00:08:11.382 Module: software 00:08:11.382 Queue depth: 32 00:08:11.382 Allocate depth: 32 00:08:11.382 # threads/core: 1 00:08:11.382 Run time: 1 seconds 00:08:11.382 Verify: No 00:08:11.382 00:08:11.382 Running for 1 seconds... 
00:08:11.382 00:08:11.382 Core,Thread Transfers Bandwidth Failed Miscompares 00:08:11.382 ------------------------------------------------------------------------------------ 00:08:11.382 0,0 282944/s 1122 MiB/s 0 0 00:08:11.382 ==================================================================================== 00:08:11.382 Total 282944/s 1105 MiB/s 0 0' 00:08:11.382 08:26:03 -- accel/accel.sh@20 -- # IFS=: 00:08:11.382 08:26:03 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:08:11.382 08:26:03 -- accel/accel.sh@20 -- # read -r var val 00:08:11.382 08:26:03 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:08:11.382 08:26:03 -- accel/accel.sh@12 -- # build_accel_config 00:08:11.382 08:26:03 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:11.382 08:26:03 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:11.382 08:26:03 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:11.382 08:26:03 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:11.382 08:26:03 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:11.382 08:26:03 -- accel/accel.sh@41 -- # local IFS=, 00:08:11.382 08:26:03 -- accel/accel.sh@42 -- # jq -r . 00:08:11.382 [2024-10-04 08:26:03.680015] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:08:11.382 [2024-10-04 08:26:03.680072] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1007029 ] 00:08:11.382 EAL: No free 2048 kB hugepages reported on node 1 00:08:11.382 [2024-10-04 08:26:03.742551] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:11.382 [2024-10-04 08:26:03.776654] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:11.382 08:26:03 -- accel/accel.sh@21 -- # val= 00:08:11.382 08:26:03 -- accel/accel.sh@22 -- # case "$var" in 00:08:11.382 08:26:03 -- accel/accel.sh@20 -- # IFS=: 00:08:11.382 08:26:03 -- accel/accel.sh@20 -- # read -r var val 00:08:11.382 08:26:03 -- accel/accel.sh@21 -- # val= 00:08:11.382 08:26:03 -- accel/accel.sh@22 -- # case "$var" in 00:08:11.382 08:26:03 -- accel/accel.sh@20 -- # IFS=: 00:08:11.382 08:26:03 -- accel/accel.sh@20 -- # read -r var val 00:08:11.382 08:26:03 -- accel/accel.sh@21 -- # val=0x1 00:08:11.382 08:26:03 -- accel/accel.sh@22 -- # case "$var" in 00:08:11.382 08:26:03 -- accel/accel.sh@20 -- # IFS=: 00:08:11.382 08:26:03 -- accel/accel.sh@20 -- # read -r var val 00:08:11.382 08:26:03 -- accel/accel.sh@21 -- # val= 00:08:11.382 08:26:03 -- accel/accel.sh@22 -- # case "$var" in 00:08:11.382 08:26:03 -- accel/accel.sh@20 -- # IFS=: 00:08:11.382 08:26:03 -- accel/accel.sh@20 -- # read -r var val 00:08:11.382 08:26:03 -- accel/accel.sh@21 -- # val= 00:08:11.382 08:26:03 -- accel/accel.sh@22 -- # case "$var" in 00:08:11.382 08:26:03 -- accel/accel.sh@20 -- # IFS=: 00:08:11.382 08:26:03 -- accel/accel.sh@20 -- # read -r var val 00:08:11.382 08:26:03 -- accel/accel.sh@21 -- # val=dif_generate 00:08:11.382 08:26:03 -- accel/accel.sh@22 -- # case "$var" in 00:08:11.382 08:26:03 -- accel/accel.sh@24 -- # accel_opc=dif_generate 00:08:11.382 08:26:03 -- accel/accel.sh@20 -- # IFS=: 00:08:11.383 08:26:03 -- accel/accel.sh@20 -- # read -r var val 00:08:11.383 08:26:03 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:11.383 08:26:03 -- accel/accel.sh@22 -- # case "$var" in 00:08:11.383 08:26:03 -- accel/accel.sh@20 -- # IFS=: 
00:08:11.383 08:26:03 -- accel/accel.sh@20 -- # read -r var val 00:08:11.383 08:26:03 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:11.383 08:26:03 -- accel/accel.sh@22 -- # case "$var" in 00:08:11.383 08:26:03 -- accel/accel.sh@20 -- # IFS=: 00:08:11.383 08:26:03 -- accel/accel.sh@20 -- # read -r var val 00:08:11.383 08:26:03 -- accel/accel.sh@21 -- # val='512 bytes' 00:08:11.383 08:26:03 -- accel/accel.sh@22 -- # case "$var" in 00:08:11.383 08:26:03 -- accel/accel.sh@20 -- # IFS=: 00:08:11.383 08:26:03 -- accel/accel.sh@20 -- # read -r var val 00:08:11.383 08:26:03 -- accel/accel.sh@21 -- # val='8 bytes' 00:08:11.383 08:26:03 -- accel/accel.sh@22 -- # case "$var" in 00:08:11.383 08:26:03 -- accel/accel.sh@20 -- # IFS=: 00:08:11.383 08:26:03 -- accel/accel.sh@20 -- # read -r var val 00:08:11.383 08:26:03 -- accel/accel.sh@21 -- # val= 00:08:11.383 08:26:03 -- accel/accel.sh@22 -- # case "$var" in 00:08:11.383 08:26:03 -- accel/accel.sh@20 -- # IFS=: 00:08:11.383 08:26:03 -- accel/accel.sh@20 -- # read -r var val 00:08:11.383 08:26:03 -- accel/accel.sh@21 -- # val=software 00:08:11.383 08:26:03 -- accel/accel.sh@22 -- # case "$var" in 00:08:11.383 08:26:03 -- accel/accel.sh@23 -- # accel_module=software 00:08:11.383 08:26:03 -- accel/accel.sh@20 -- # IFS=: 00:08:11.383 08:26:03 -- accel/accel.sh@20 -- # read -r var val 00:08:11.383 08:26:03 -- accel/accel.sh@21 -- # val=32 00:08:11.383 08:26:03 -- accel/accel.sh@22 -- # case "$var" in 00:08:11.383 08:26:03 -- accel/accel.sh@20 -- # IFS=: 00:08:11.383 08:26:03 -- accel/accel.sh@20 -- # read -r var val 00:08:11.383 08:26:03 -- accel/accel.sh@21 -- # val=32 00:08:11.383 08:26:03 -- accel/accel.sh@22 -- # case "$var" in 00:08:11.383 08:26:03 -- accel/accel.sh@20 -- # IFS=: 00:08:11.383 08:26:03 -- accel/accel.sh@20 -- # read -r var val 00:08:11.383 08:26:03 -- accel/accel.sh@21 -- # val=1 00:08:11.383 08:26:03 -- accel/accel.sh@22 -- # case "$var" in 00:08:11.383 08:26:03 -- accel/accel.sh@20 -- # IFS=: 00:08:11.383 08:26:03 -- accel/accel.sh@20 -- # read -r var val 00:08:11.383 08:26:03 -- accel/accel.sh@21 -- # val='1 seconds' 00:08:11.383 08:26:03 -- accel/accel.sh@22 -- # case "$var" in 00:08:11.383 08:26:03 -- accel/accel.sh@20 -- # IFS=: 00:08:11.383 08:26:03 -- accel/accel.sh@20 -- # read -r var val 00:08:11.383 08:26:03 -- accel/accel.sh@21 -- # val=No 00:08:11.383 08:26:03 -- accel/accel.sh@22 -- # case "$var" in 00:08:11.383 08:26:03 -- accel/accel.sh@20 -- # IFS=: 00:08:11.383 08:26:03 -- accel/accel.sh@20 -- # read -r var val 00:08:11.383 08:26:03 -- accel/accel.sh@21 -- # val= 00:08:11.383 08:26:03 -- accel/accel.sh@22 -- # case "$var" in 00:08:11.383 08:26:03 -- accel/accel.sh@20 -- # IFS=: 00:08:11.383 08:26:03 -- accel/accel.sh@20 -- # read -r var val 00:08:11.383 08:26:03 -- accel/accel.sh@21 -- # val= 00:08:11.383 08:26:03 -- accel/accel.sh@22 -- # case "$var" in 00:08:11.383 08:26:03 -- accel/accel.sh@20 -- # IFS=: 00:08:11.383 08:26:03 -- accel/accel.sh@20 -- # read -r var val 00:08:12.320 08:26:04 -- accel/accel.sh@21 -- # val= 00:08:12.320 08:26:04 -- accel/accel.sh@22 -- # case "$var" in 00:08:12.320 08:26:04 -- accel/accel.sh@20 -- # IFS=: 00:08:12.320 08:26:04 -- accel/accel.sh@20 -- # read -r var val 00:08:12.320 08:26:04 -- accel/accel.sh@21 -- # val= 00:08:12.320 08:26:04 -- accel/accel.sh@22 -- # case "$var" in 00:08:12.320 08:26:04 -- accel/accel.sh@20 -- # IFS=: 00:08:12.320 08:26:04 -- accel/accel.sh@20 -- # read -r var val 00:08:12.320 08:26:04 -- accel/accel.sh@21 -- # val= 00:08:12.320 08:26:04 -- 
accel/accel.sh@22 -- # case "$var" in 00:08:12.320 08:26:04 -- accel/accel.sh@20 -- # IFS=: 00:08:12.320 08:26:04 -- accel/accel.sh@20 -- # read -r var val 00:08:12.320 08:26:04 -- accel/accel.sh@21 -- # val= 00:08:12.320 08:26:04 -- accel/accel.sh@22 -- # case "$var" in 00:08:12.320 08:26:04 -- accel/accel.sh@20 -- # IFS=: 00:08:12.320 08:26:04 -- accel/accel.sh@20 -- # read -r var val 00:08:12.320 08:26:04 -- accel/accel.sh@21 -- # val= 00:08:12.320 08:26:04 -- accel/accel.sh@22 -- # case "$var" in 00:08:12.320 08:26:04 -- accel/accel.sh@20 -- # IFS=: 00:08:12.320 08:26:04 -- accel/accel.sh@20 -- # read -r var val 00:08:12.320 08:26:04 -- accel/accel.sh@21 -- # val= 00:08:12.320 08:26:04 -- accel/accel.sh@22 -- # case "$var" in 00:08:12.320 08:26:04 -- accel/accel.sh@20 -- # IFS=: 00:08:12.320 08:26:04 -- accel/accel.sh@20 -- # read -r var val 00:08:12.320 08:26:04 -- accel/accel.sh@28 -- # [[ -n software ]] 00:08:12.320 08:26:04 -- accel/accel.sh@28 -- # [[ -n dif_generate ]] 00:08:12.320 08:26:04 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:12.320 00:08:12.320 real 0m2.548s 00:08:12.320 user 0m2.320s 00:08:12.320 sys 0m0.228s 00:08:12.320 08:26:04 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:12.320 08:26:04 -- common/autotest_common.sh@10 -- # set +x 00:08:12.320 ************************************ 00:08:12.320 END TEST accel_dif_generate 00:08:12.320 ************************************ 00:08:12.320 08:26:04 -- accel/accel.sh@105 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:08:12.320 08:26:04 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:08:12.320 08:26:04 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:12.320 08:26:04 -- common/autotest_common.sh@10 -- # set +x 00:08:12.320 ************************************ 00:08:12.320 START TEST accel_dif_generate_copy 00:08:12.320 ************************************ 00:08:12.320 08:26:04 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_generate_copy 00:08:12.320 08:26:04 -- accel/accel.sh@16 -- # local accel_opc 00:08:12.320 08:26:04 -- accel/accel.sh@17 -- # local accel_module 00:08:12.320 08:26:04 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate_copy 00:08:12.320 08:26:04 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:08:12.320 08:26:04 -- accel/accel.sh@12 -- # build_accel_config 00:08:12.320 08:26:04 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:12.320 08:26:04 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:12.320 08:26:04 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:12.320 08:26:04 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:12.320 08:26:04 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:12.320 08:26:04 -- accel/accel.sh@41 -- # local IFS=, 00:08:12.320 08:26:04 -- accel/accel.sh@42 -- # jq -r . 00:08:12.320 [2024-10-04 08:26:04.996649] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
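(On the dif_generate case that just closed: its command line differs from dif_verify only in the workload name,

  ./spdk/build/examples/accel_perf -t 1 -w dif_generate

and its Total row is self-consistent: 282944 transfers/s x 4096 bytes comes to roughly 1105 MiB/s.)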
00:08:12.320 [2024-10-04 08:26:04.996739] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1007197 ] 00:08:12.579 EAL: No free 2048 kB hugepages reported on node 1 00:08:12.579 [2024-10-04 08:26:05.065407] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:12.579 [2024-10-04 08:26:05.100375] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:13.956 08:26:06 -- accel/accel.sh@18 -- # out=' 00:08:13.956 SPDK Configuration: 00:08:13.956 Core mask: 0x1 00:08:13.956 00:08:13.956 Accel Perf Configuration: 00:08:13.956 Workload Type: dif_generate_copy 00:08:13.956 Vector size: 4096 bytes 00:08:13.956 Transfer size: 4096 bytes 00:08:13.956 Vector count 1 00:08:13.956 Module: software 00:08:13.956 Queue depth: 32 00:08:13.956 Allocate depth: 32 00:08:13.956 # threads/core: 1 00:08:13.956 Run time: 1 seconds 00:08:13.956 Verify: No 00:08:13.956 00:08:13.956 Running for 1 seconds... 00:08:13.956 00:08:13.956 Core,Thread Transfers Bandwidth Failed Miscompares 00:08:13.956 ------------------------------------------------------------------------------------ 00:08:13.956 0,0 228800/s 907 MiB/s 0 0 00:08:13.956 ==================================================================================== 00:08:13.956 Total 228800/s 893 MiB/s 0 0' 00:08:13.956 08:26:06 -- accel/accel.sh@20 -- # IFS=: 00:08:13.956 08:26:06 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:08:13.956 08:26:06 -- accel/accel.sh@20 -- # read -r var val 00:08:13.956 08:26:06 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:08:13.956 08:26:06 -- accel/accel.sh@12 -- # build_accel_config 00:08:13.956 08:26:06 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:13.956 08:26:06 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:13.956 08:26:06 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:13.956 08:26:06 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:13.956 08:26:06 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:13.956 08:26:06 -- accel/accel.sh@41 -- # local IFS=, 00:08:13.956 08:26:06 -- accel/accel.sh@42 -- # jq -r . 00:08:13.956 [2024-10-04 08:26:06.278685] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
00:08:13.956 [2024-10-04 08:26:06.278776] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1007381 ] 00:08:13.956 EAL: No free 2048 kB hugepages reported on node 1 00:08:13.956 [2024-10-04 08:26:06.347654] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:13.956 [2024-10-04 08:26:06.381532] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:13.956 08:26:06 -- accel/accel.sh@21 -- # val= 00:08:13.956 08:26:06 -- accel/accel.sh@22 -- # case "$var" in 00:08:13.956 08:26:06 -- accel/accel.sh@20 -- # IFS=: 00:08:13.956 08:26:06 -- accel/accel.sh@20 -- # read -r var val 00:08:13.956 08:26:06 -- accel/accel.sh@21 -- # val= 00:08:13.956 08:26:06 -- accel/accel.sh@22 -- # case "$var" in 00:08:13.956 08:26:06 -- accel/accel.sh@20 -- # IFS=: 00:08:13.956 08:26:06 -- accel/accel.sh@20 -- # read -r var val 00:08:13.956 08:26:06 -- accel/accel.sh@21 -- # val=0x1 00:08:13.956 08:26:06 -- accel/accel.sh@22 -- # case "$var" in 00:08:13.956 08:26:06 -- accel/accel.sh@20 -- # IFS=: 00:08:13.956 08:26:06 -- accel/accel.sh@20 -- # read -r var val 00:08:13.956 08:26:06 -- accel/accel.sh@21 -- # val= 00:08:13.956 08:26:06 -- accel/accel.sh@22 -- # case "$var" in 00:08:13.956 08:26:06 -- accel/accel.sh@20 -- # IFS=: 00:08:13.956 08:26:06 -- accel/accel.sh@20 -- # read -r var val 00:08:13.956 08:26:06 -- accel/accel.sh@21 -- # val= 00:08:13.956 08:26:06 -- accel/accel.sh@22 -- # case "$var" in 00:08:13.956 08:26:06 -- accel/accel.sh@20 -- # IFS=: 00:08:13.956 08:26:06 -- accel/accel.sh@20 -- # read -r var val 00:08:13.956 08:26:06 -- accel/accel.sh@21 -- # val=dif_generate_copy 00:08:13.956 08:26:06 -- accel/accel.sh@22 -- # case "$var" in 00:08:13.956 08:26:06 -- accel/accel.sh@24 -- # accel_opc=dif_generate_copy 00:08:13.956 08:26:06 -- accel/accel.sh@20 -- # IFS=: 00:08:13.956 08:26:06 -- accel/accel.sh@20 -- # read -r var val 00:08:13.956 08:26:06 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:13.956 08:26:06 -- accel/accel.sh@22 -- # case "$var" in 00:08:13.956 08:26:06 -- accel/accel.sh@20 -- # IFS=: 00:08:13.956 08:26:06 -- accel/accel.sh@20 -- # read -r var val 00:08:13.956 08:26:06 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:13.956 08:26:06 -- accel/accel.sh@22 -- # case "$var" in 00:08:13.956 08:26:06 -- accel/accel.sh@20 -- # IFS=: 00:08:13.956 08:26:06 -- accel/accel.sh@20 -- # read -r var val 00:08:13.956 08:26:06 -- accel/accel.sh@21 -- # val= 00:08:13.956 08:26:06 -- accel/accel.sh@22 -- # case "$var" in 00:08:13.956 08:26:06 -- accel/accel.sh@20 -- # IFS=: 00:08:13.956 08:26:06 -- accel/accel.sh@20 -- # read -r var val 00:08:13.956 08:26:06 -- accel/accel.sh@21 -- # val=software 00:08:13.956 08:26:06 -- accel/accel.sh@22 -- # case "$var" in 00:08:13.956 08:26:06 -- accel/accel.sh@23 -- # accel_module=software 00:08:13.956 08:26:06 -- accel/accel.sh@20 -- # IFS=: 00:08:13.956 08:26:06 -- accel/accel.sh@20 -- # read -r var val 00:08:13.956 08:26:06 -- accel/accel.sh@21 -- # val=32 00:08:13.956 08:26:06 -- accel/accel.sh@22 -- # case "$var" in 00:08:13.956 08:26:06 -- accel/accel.sh@20 -- # IFS=: 00:08:13.956 08:26:06 -- accel/accel.sh@20 -- # read -r var val 00:08:13.956 08:26:06 -- accel/accel.sh@21 -- # val=32 00:08:13.956 08:26:06 -- accel/accel.sh@22 -- # case "$var" in 00:08:13.956 08:26:06 -- accel/accel.sh@20 -- # IFS=: 00:08:13.956 08:26:06 -- accel/accel.sh@20 -- # read -r 
var val 00:08:13.957 08:26:06 -- accel/accel.sh@21 -- # val=1 00:08:13.957 08:26:06 -- accel/accel.sh@22 -- # case "$var" in 00:08:13.957 08:26:06 -- accel/accel.sh@20 -- # IFS=: 00:08:13.957 08:26:06 -- accel/accel.sh@20 -- # read -r var val 00:08:13.957 08:26:06 -- accel/accel.sh@21 -- # val='1 seconds' 00:08:13.957 08:26:06 -- accel/accel.sh@22 -- # case "$var" in 00:08:13.957 08:26:06 -- accel/accel.sh@20 -- # IFS=: 00:08:13.957 08:26:06 -- accel/accel.sh@20 -- # read -r var val 00:08:13.957 08:26:06 -- accel/accel.sh@21 -- # val=No 00:08:13.957 08:26:06 -- accel/accel.sh@22 -- # case "$var" in 00:08:13.957 08:26:06 -- accel/accel.sh@20 -- # IFS=: 00:08:13.957 08:26:06 -- accel/accel.sh@20 -- # read -r var val 00:08:13.957 08:26:06 -- accel/accel.sh@21 -- # val= 00:08:13.957 08:26:06 -- accel/accel.sh@22 -- # case "$var" in 00:08:13.957 08:26:06 -- accel/accel.sh@20 -- # IFS=: 00:08:13.957 08:26:06 -- accel/accel.sh@20 -- # read -r var val 00:08:13.957 08:26:06 -- accel/accel.sh@21 -- # val= 00:08:13.957 08:26:06 -- accel/accel.sh@22 -- # case "$var" in 00:08:13.957 08:26:06 -- accel/accel.sh@20 -- # IFS=: 00:08:13.957 08:26:06 -- accel/accel.sh@20 -- # read -r var val 00:08:14.893 08:26:07 -- accel/accel.sh@21 -- # val= 00:08:14.893 08:26:07 -- accel/accel.sh@22 -- # case "$var" in 00:08:14.893 08:26:07 -- accel/accel.sh@20 -- # IFS=: 00:08:14.893 08:26:07 -- accel/accel.sh@20 -- # read -r var val 00:08:14.893 08:26:07 -- accel/accel.sh@21 -- # val= 00:08:14.893 08:26:07 -- accel/accel.sh@22 -- # case "$var" in 00:08:14.893 08:26:07 -- accel/accel.sh@20 -- # IFS=: 00:08:14.893 08:26:07 -- accel/accel.sh@20 -- # read -r var val 00:08:14.893 08:26:07 -- accel/accel.sh@21 -- # val= 00:08:14.893 08:26:07 -- accel/accel.sh@22 -- # case "$var" in 00:08:14.893 08:26:07 -- accel/accel.sh@20 -- # IFS=: 00:08:14.893 08:26:07 -- accel/accel.sh@20 -- # read -r var val 00:08:14.893 08:26:07 -- accel/accel.sh@21 -- # val= 00:08:14.893 08:26:07 -- accel/accel.sh@22 -- # case "$var" in 00:08:14.893 08:26:07 -- accel/accel.sh@20 -- # IFS=: 00:08:14.893 08:26:07 -- accel/accel.sh@20 -- # read -r var val 00:08:14.893 08:26:07 -- accel/accel.sh@21 -- # val= 00:08:14.893 08:26:07 -- accel/accel.sh@22 -- # case "$var" in 00:08:14.893 08:26:07 -- accel/accel.sh@20 -- # IFS=: 00:08:14.893 08:26:07 -- accel/accel.sh@20 -- # read -r var val 00:08:14.893 08:26:07 -- accel/accel.sh@21 -- # val= 00:08:14.893 08:26:07 -- accel/accel.sh@22 -- # case "$var" in 00:08:14.893 08:26:07 -- accel/accel.sh@20 -- # IFS=: 00:08:14.893 08:26:07 -- accel/accel.sh@20 -- # read -r var val 00:08:14.893 08:26:07 -- accel/accel.sh@28 -- # [[ -n software ]] 00:08:14.893 08:26:07 -- accel/accel.sh@28 -- # [[ -n dif_generate_copy ]] 00:08:14.893 08:26:07 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:14.893 00:08:14.893 real 0m2.566s 00:08:14.893 user 0m2.313s 00:08:14.893 sys 0m0.252s 00:08:14.893 08:26:07 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:14.893 08:26:07 -- common/autotest_common.sh@10 -- # set +x 00:08:14.893 ************************************ 00:08:14.893 END TEST accel_dif_generate_copy 00:08:14.893 ************************************ 00:08:15.151 08:26:07 -- accel/accel.sh@107 -- # [[ y == y ]] 00:08:15.151 08:26:07 -- accel/accel.sh@108 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:15.151 08:26:07 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']' 00:08:15.151 08:26:07 -- 
common/autotest_common.sh@1083 -- # xtrace_disable 00:08:15.151 08:26:07 -- common/autotest_common.sh@10 -- # set +x 00:08:15.151 ************************************ 00:08:15.151 START TEST accel_comp 00:08:15.151 ************************************ 00:08:15.151 08:26:07 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:15.151 08:26:07 -- accel/accel.sh@16 -- # local accel_opc 00:08:15.151 08:26:07 -- accel/accel.sh@17 -- # local accel_module 00:08:15.151 08:26:07 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:15.151 08:26:07 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:15.151 08:26:07 -- accel/accel.sh@12 -- # build_accel_config 00:08:15.151 08:26:07 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:15.151 08:26:07 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:15.151 08:26:07 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:15.151 08:26:07 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:15.151 08:26:07 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:15.151 08:26:07 -- accel/accel.sh@41 -- # local IFS=, 00:08:15.151 08:26:07 -- accel/accel.sh@42 -- # jq -r . 00:08:15.151 [2024-10-04 08:26:07.589461] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:08:15.151 [2024-10-04 08:26:07.589522] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1007662 ] 00:08:15.151 EAL: No free 2048 kB hugepages reported on node 1 00:08:15.151 [2024-10-04 08:26:07.648317] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:15.151 [2024-10-04 08:26:07.683279] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:16.528 08:26:08 -- accel/accel.sh@18 -- # out='Preparing input file... 00:08:16.528 00:08:16.528 SPDK Configuration: 00:08:16.528 Core mask: 0x1 00:08:16.528 00:08:16.528 Accel Perf Configuration: 00:08:16.528 Workload Type: compress 00:08:16.528 Transfer size: 4096 bytes 00:08:16.528 Vector count 1 00:08:16.528 Module: software 00:08:16.528 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:16.528 Queue depth: 32 00:08:16.528 Allocate depth: 32 00:08:16.528 # threads/core: 1 00:08:16.528 Run time: 1 seconds 00:08:16.528 Verify: No 00:08:16.528 00:08:16.528 Running for 1 seconds... 
00:08:16.528 00:08:16.528 Core,Thread Transfers Bandwidth Failed Miscompares 00:08:16.528 ------------------------------------------------------------------------------------ 00:08:16.528 0,0 67520/s 281 MiB/s 0 0 00:08:16.528 ==================================================================================== 00:08:16.528 Total 67520/s 263 MiB/s 0 0' 00:08:16.528 08:26:08 -- accel/accel.sh@20 -- # IFS=: 00:08:16.528 08:26:08 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:16.528 08:26:08 -- accel/accel.sh@20 -- # read -r var val 00:08:16.528 08:26:08 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:16.528 08:26:08 -- accel/accel.sh@12 -- # build_accel_config 00:08:16.528 08:26:08 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:16.528 08:26:08 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:16.528 08:26:08 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:16.528 08:26:08 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:16.528 08:26:08 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:16.528 08:26:08 -- accel/accel.sh@41 -- # local IFS=, 00:08:16.528 08:26:08 -- accel/accel.sh@42 -- # jq -r . 00:08:16.528 [2024-10-04 08:26:08.853083] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:08:16.528 [2024-10-04 08:26:08.853137] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1007928 ] 00:08:16.528 EAL: No free 2048 kB hugepages reported on node 1 00:08:16.528 [2024-10-04 08:26:08.915411] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:16.528 [2024-10-04 08:26:08.949575] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:16.528 08:26:08 -- accel/accel.sh@21 -- # val= 00:08:16.528 08:26:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:16.528 08:26:08 -- accel/accel.sh@20 -- # IFS=: 00:08:16.528 08:26:08 -- accel/accel.sh@20 -- # read -r var val 00:08:16.528 08:26:08 -- accel/accel.sh@21 -- # val= 00:08:16.528 08:26:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:16.528 08:26:08 -- accel/accel.sh@20 -- # IFS=: 00:08:16.528 08:26:08 -- accel/accel.sh@20 -- # read -r var val 00:08:16.528 08:26:08 -- accel/accel.sh@21 -- # val= 00:08:16.528 08:26:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:16.528 08:26:08 -- accel/accel.sh@20 -- # IFS=: 00:08:16.528 08:26:08 -- accel/accel.sh@20 -- # read -r var val 00:08:16.528 08:26:08 -- accel/accel.sh@21 -- # val=0x1 00:08:16.528 08:26:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:16.528 08:26:08 -- accel/accel.sh@20 -- # IFS=: 00:08:16.528 08:26:08 -- accel/accel.sh@20 -- # read -r var val 00:08:16.528 08:26:08 -- accel/accel.sh@21 -- # val= 00:08:16.528 08:26:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:16.528 08:26:08 -- accel/accel.sh@20 -- # IFS=: 00:08:16.529 08:26:08 -- accel/accel.sh@20 -- # read -r var val 00:08:16.529 08:26:08 -- accel/accel.sh@21 -- # val= 00:08:16.529 08:26:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:16.529 08:26:08 -- accel/accel.sh@20 -- # IFS=: 00:08:16.529 08:26:08 -- accel/accel.sh@20 -- # read -r var val 00:08:16.529 08:26:08 -- accel/accel.sh@21 -- # val=compress 00:08:16.529 08:26:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:16.529 
08:26:08 -- accel/accel.sh@24 -- # accel_opc=compress 00:08:16.529 08:26:08 -- accel/accel.sh@20 -- # IFS=: 00:08:16.529 08:26:08 -- accel/accel.sh@20 -- # read -r var val 00:08:16.529 08:26:08 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:16.529 08:26:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:16.529 08:26:08 -- accel/accel.sh@20 -- # IFS=: 00:08:16.529 08:26:08 -- accel/accel.sh@20 -- # read -r var val 00:08:16.529 08:26:08 -- accel/accel.sh@21 -- # val= 00:08:16.529 08:26:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:16.529 08:26:08 -- accel/accel.sh@20 -- # IFS=: 00:08:16.529 08:26:08 -- accel/accel.sh@20 -- # read -r var val 00:08:16.529 08:26:08 -- accel/accel.sh@21 -- # val=software 00:08:16.529 08:26:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:16.529 08:26:08 -- accel/accel.sh@23 -- # accel_module=software 00:08:16.529 08:26:08 -- accel/accel.sh@20 -- # IFS=: 00:08:16.529 08:26:08 -- accel/accel.sh@20 -- # read -r var val 00:08:16.529 08:26:08 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:16.529 08:26:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:16.529 08:26:08 -- accel/accel.sh@20 -- # IFS=: 00:08:16.529 08:26:08 -- accel/accel.sh@20 -- # read -r var val 00:08:16.529 08:26:08 -- accel/accel.sh@21 -- # val=32 00:08:16.529 08:26:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:16.529 08:26:08 -- accel/accel.sh@20 -- # IFS=: 00:08:16.529 08:26:08 -- accel/accel.sh@20 -- # read -r var val 00:08:16.529 08:26:08 -- accel/accel.sh@21 -- # val=32 00:08:16.529 08:26:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:16.529 08:26:08 -- accel/accel.sh@20 -- # IFS=: 00:08:16.529 08:26:08 -- accel/accel.sh@20 -- # read -r var val 00:08:16.529 08:26:08 -- accel/accel.sh@21 -- # val=1 00:08:16.529 08:26:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:16.529 08:26:08 -- accel/accel.sh@20 -- # IFS=: 00:08:16.529 08:26:08 -- accel/accel.sh@20 -- # read -r var val 00:08:16.529 08:26:08 -- accel/accel.sh@21 -- # val='1 seconds' 00:08:16.529 08:26:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:16.529 08:26:08 -- accel/accel.sh@20 -- # IFS=: 00:08:16.529 08:26:08 -- accel/accel.sh@20 -- # read -r var val 00:08:16.529 08:26:08 -- accel/accel.sh@21 -- # val=No 00:08:16.529 08:26:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:16.529 08:26:08 -- accel/accel.sh@20 -- # IFS=: 00:08:16.529 08:26:08 -- accel/accel.sh@20 -- # read -r var val 00:08:16.529 08:26:08 -- accel/accel.sh@21 -- # val= 00:08:16.529 08:26:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:16.529 08:26:08 -- accel/accel.sh@20 -- # IFS=: 00:08:16.529 08:26:08 -- accel/accel.sh@20 -- # read -r var val 00:08:16.529 08:26:08 -- accel/accel.sh@21 -- # val= 00:08:16.529 08:26:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:16.529 08:26:08 -- accel/accel.sh@20 -- # IFS=: 00:08:16.529 08:26:08 -- accel/accel.sh@20 -- # read -r var val 00:08:17.464 08:26:10 -- accel/accel.sh@21 -- # val= 00:08:17.464 08:26:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:17.464 08:26:10 -- accel/accel.sh@20 -- # IFS=: 00:08:17.464 08:26:10 -- accel/accel.sh@20 -- # read -r var val 00:08:17.464 08:26:10 -- accel/accel.sh@21 -- # val= 00:08:17.464 08:26:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:17.464 08:26:10 -- accel/accel.sh@20 -- # IFS=: 00:08:17.464 08:26:10 -- accel/accel.sh@20 -- # read -r var val 00:08:17.464 08:26:10 -- accel/accel.sh@21 -- # val= 00:08:17.464 08:26:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:17.464 08:26:10 -- accel/accel.sh@20 -- # 
IFS=: 00:08:17.464 08:26:10 -- accel/accel.sh@20 -- # read -r var val 00:08:17.464 08:26:10 -- accel/accel.sh@21 -- # val= 00:08:17.464 08:26:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:17.465 08:26:10 -- accel/accel.sh@20 -- # IFS=: 00:08:17.465 08:26:10 -- accel/accel.sh@20 -- # read -r var val 00:08:17.465 08:26:10 -- accel/accel.sh@21 -- # val= 00:08:17.465 08:26:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:17.465 08:26:10 -- accel/accel.sh@20 -- # IFS=: 00:08:17.465 08:26:10 -- accel/accel.sh@20 -- # read -r var val 00:08:17.465 08:26:10 -- accel/accel.sh@21 -- # val= 00:08:17.465 08:26:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:17.465 08:26:10 -- accel/accel.sh@20 -- # IFS=: 00:08:17.465 08:26:10 -- accel/accel.sh@20 -- # read -r var val 00:08:17.465 08:26:10 -- accel/accel.sh@28 -- # [[ -n software ]] 00:08:17.465 08:26:10 -- accel/accel.sh@28 -- # [[ -n compress ]] 00:08:17.465 08:26:10 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:17.465 00:08:17.465 real 0m2.536s 00:08:17.465 user 0m2.307s 00:08:17.465 sys 0m0.227s 00:08:17.465 08:26:10 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:17.465 08:26:10 -- common/autotest_common.sh@10 -- # set +x 00:08:17.465 ************************************ 00:08:17.465 END TEST accel_comp 00:08:17.465 ************************************ 00:08:17.724 08:26:10 -- accel/accel.sh@109 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:08:17.724 08:26:10 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:08:17.724 08:26:10 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:17.724 08:26:10 -- common/autotest_common.sh@10 -- # set +x 00:08:17.724 ************************************ 00:08:17.724 START TEST accel_decomp 00:08:17.724 ************************************ 00:08:17.724 08:26:10 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:08:17.724 08:26:10 -- accel/accel.sh@16 -- # local accel_opc 00:08:17.724 08:26:10 -- accel/accel.sh@17 -- # local accel_module 00:08:17.724 08:26:10 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:08:17.724 08:26:10 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:08:17.724 08:26:10 -- accel/accel.sh@12 -- # build_accel_config 00:08:17.724 08:26:10 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:17.724 08:26:10 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:17.724 08:26:10 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:17.724 08:26:10 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:17.724 08:26:10 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:17.724 08:26:10 -- accel/accel.sh@41 -- # local IFS=, 00:08:17.724 08:26:10 -- accel/accel.sh@42 -- # jq -r . 00:08:17.724 [2024-10-04 08:26:10.172451] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
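A quick check on the accel_comp numbers that finished above: compress adds only -l, which per the "File Name" row of the configuration dump points accel_perf at an input file to compress, and the Total row is consistent with the measured op rate. A hedged sketch, with SPDK_DIR as in the first sketch:

  # compress the bundled test file for 1 second, software module
  sudo "$SPDK_DIR/build/examples/accel_perf" -t 1 -w compress \
      -l "$SPDK_DIR/test/accel/bib"
  echo $(( 67520 * 4096 / 1048576 ))   # prints 263, matching "263 MiB/s"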
00:08:17.724 [2024-10-04 08:26:10.172540] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1008225 ] 00:08:17.724 EAL: No free 2048 kB hugepages reported on node 1 00:08:17.724 [2024-10-04 08:26:10.241019] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:17.724 [2024-10-04 08:26:10.276040] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:19.099 08:26:11 -- accel/accel.sh@18 -- # out='Preparing input file... 00:08:19.099 00:08:19.099 SPDK Configuration: 00:08:19.099 Core mask: 0x1 00:08:19.099 00:08:19.099 Accel Perf Configuration: 00:08:19.099 Workload Type: decompress 00:08:19.099 Transfer size: 4096 bytes 00:08:19.099 Vector count 1 00:08:19.099 Module: software 00:08:19.099 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:19.099 Queue depth: 32 00:08:19.099 Allocate depth: 32 00:08:19.099 # threads/core: 1 00:08:19.099 Run time: 1 seconds 00:08:19.099 Verify: Yes 00:08:19.099 00:08:19.099 Running for 1 seconds... 00:08:19.099 00:08:19.099 Core,Thread Transfers Bandwidth Failed Miscompares 00:08:19.099 ------------------------------------------------------------------------------------ 00:08:19.099 0,0 91456/s 168 MiB/s 0 0 00:08:19.099 ==================================================================================== 00:08:19.100 Total 91456/s 357 MiB/s 0 0' 00:08:19.100 08:26:11 -- accel/accel.sh@20 -- # IFS=: 00:08:19.100 08:26:11 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:08:19.100 08:26:11 -- accel/accel.sh@20 -- # read -r var val 00:08:19.100 08:26:11 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:08:19.100 08:26:11 -- accel/accel.sh@12 -- # build_accel_config 00:08:19.100 08:26:11 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:19.100 08:26:11 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:19.100 08:26:11 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:19.100 08:26:11 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:19.100 08:26:11 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:19.100 08:26:11 -- accel/accel.sh@41 -- # local IFS=, 00:08:19.100 08:26:11 -- accel/accel.sh@42 -- # jq -r . 00:08:19.100 [2024-10-04 08:26:11.446376] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
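The decompress pass above adds -y, and the configuration dump flips from "Verify: No" to "Verify: Yes", so each output buffer is checked against the expected contents rather than only timed. A hedged sketch of the equivalent standalone run:

  sudo "$SPDK_DIR/build/examples/accel_perf" -t 1 -w decompress \
      -l "$SPDK_DIR/test/accel/bib" -y
  echo $(( 91456 * 4096 / 1048576 ))   # prints 357, matching the Total row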
00:08:19.100 [2024-10-04 08:26:11.446431] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1008452 ] 00:08:19.100 EAL: No free 2048 kB hugepages reported on node 1 00:08:19.100 [2024-10-04 08:26:11.508813] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:19.100 [2024-10-04 08:26:11.542261] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:19.100 08:26:11 -- accel/accel.sh@21 -- # val= 00:08:19.100 08:26:11 -- accel/accel.sh@22 -- # case "$var" in 00:08:19.100 08:26:11 -- accel/accel.sh@20 -- # IFS=: 00:08:19.100 08:26:11 -- accel/accel.sh@20 -- # read -r var val 00:08:19.100 08:26:11 -- accel/accel.sh@21 -- # val= 00:08:19.100 08:26:11 -- accel/accel.sh@22 -- # case "$var" in 00:08:19.100 08:26:11 -- accel/accel.sh@20 -- # IFS=: 00:08:19.100 08:26:11 -- accel/accel.sh@20 -- # read -r var val 00:08:19.100 08:26:11 -- accel/accel.sh@21 -- # val= 00:08:19.100 08:26:11 -- accel/accel.sh@22 -- # case "$var" in 00:08:19.100 08:26:11 -- accel/accel.sh@20 -- # IFS=: 00:08:19.100 08:26:11 -- accel/accel.sh@20 -- # read -r var val 00:08:19.100 08:26:11 -- accel/accel.sh@21 -- # val=0x1 00:08:19.100 08:26:11 -- accel/accel.sh@22 -- # case "$var" in 00:08:19.100 08:26:11 -- accel/accel.sh@20 -- # IFS=: 00:08:19.100 08:26:11 -- accel/accel.sh@20 -- # read -r var val 00:08:19.100 08:26:11 -- accel/accel.sh@21 -- # val= 00:08:19.100 08:26:11 -- accel/accel.sh@22 -- # case "$var" in 00:08:19.100 08:26:11 -- accel/accel.sh@20 -- # IFS=: 00:08:19.100 08:26:11 -- accel/accel.sh@20 -- # read -r var val 00:08:19.100 08:26:11 -- accel/accel.sh@21 -- # val= 00:08:19.100 08:26:11 -- accel/accel.sh@22 -- # case "$var" in 00:08:19.100 08:26:11 -- accel/accel.sh@20 -- # IFS=: 00:08:19.100 08:26:11 -- accel/accel.sh@20 -- # read -r var val 00:08:19.100 08:26:11 -- accel/accel.sh@21 -- # val=decompress 00:08:19.100 08:26:11 -- accel/accel.sh@22 -- # case "$var" in 00:08:19.100 08:26:11 -- accel/accel.sh@24 -- # accel_opc=decompress 00:08:19.100 08:26:11 -- accel/accel.sh@20 -- # IFS=: 00:08:19.100 08:26:11 -- accel/accel.sh@20 -- # read -r var val 00:08:19.100 08:26:11 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:19.100 08:26:11 -- accel/accel.sh@22 -- # case "$var" in 00:08:19.100 08:26:11 -- accel/accel.sh@20 -- # IFS=: 00:08:19.100 08:26:11 -- accel/accel.sh@20 -- # read -r var val 00:08:19.100 08:26:11 -- accel/accel.sh@21 -- # val= 00:08:19.100 08:26:11 -- accel/accel.sh@22 -- # case "$var" in 00:08:19.100 08:26:11 -- accel/accel.sh@20 -- # IFS=: 00:08:19.100 08:26:11 -- accel/accel.sh@20 -- # read -r var val 00:08:19.100 08:26:11 -- accel/accel.sh@21 -- # val=software 00:08:19.100 08:26:11 -- accel/accel.sh@22 -- # case "$var" in 00:08:19.100 08:26:11 -- accel/accel.sh@23 -- # accel_module=software 00:08:19.100 08:26:11 -- accel/accel.sh@20 -- # IFS=: 00:08:19.100 08:26:11 -- accel/accel.sh@20 -- # read -r var val 00:08:19.100 08:26:11 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:19.100 08:26:11 -- accel/accel.sh@22 -- # case "$var" in 00:08:19.100 08:26:11 -- accel/accel.sh@20 -- # IFS=: 00:08:19.100 08:26:11 -- accel/accel.sh@20 -- # read -r var val 00:08:19.100 08:26:11 -- accel/accel.sh@21 -- # val=32 00:08:19.100 08:26:11 -- accel/accel.sh@22 -- # case "$var" in 00:08:19.100 08:26:11 -- accel/accel.sh@20 -- # IFS=: 00:08:19.100 
08:26:11 -- accel/accel.sh@20 -- # read -r var val 00:08:19.100 08:26:11 -- accel/accel.sh@21 -- # val=32 00:08:19.100 08:26:11 -- accel/accel.sh@22 -- # case "$var" in 00:08:19.100 08:26:11 -- accel/accel.sh@20 -- # IFS=: 00:08:19.100 08:26:11 -- accel/accel.sh@20 -- # read -r var val 00:08:19.100 08:26:11 -- accel/accel.sh@21 -- # val=1 00:08:19.100 08:26:11 -- accel/accel.sh@22 -- # case "$var" in 00:08:19.100 08:26:11 -- accel/accel.sh@20 -- # IFS=: 00:08:19.100 08:26:11 -- accel/accel.sh@20 -- # read -r var val 00:08:19.100 08:26:11 -- accel/accel.sh@21 -- # val='1 seconds' 00:08:19.100 08:26:11 -- accel/accel.sh@22 -- # case "$var" in 00:08:19.100 08:26:11 -- accel/accel.sh@20 -- # IFS=: 00:08:19.100 08:26:11 -- accel/accel.sh@20 -- # read -r var val 00:08:19.100 08:26:11 -- accel/accel.sh@21 -- # val=Yes 00:08:19.100 08:26:11 -- accel/accel.sh@22 -- # case "$var" in 00:08:19.100 08:26:11 -- accel/accel.sh@20 -- # IFS=: 00:08:19.100 08:26:11 -- accel/accel.sh@20 -- # read -r var val 00:08:19.100 08:26:11 -- accel/accel.sh@21 -- # val= 00:08:19.100 08:26:11 -- accel/accel.sh@22 -- # case "$var" in 00:08:19.100 08:26:11 -- accel/accel.sh@20 -- # IFS=: 00:08:19.100 08:26:11 -- accel/accel.sh@20 -- # read -r var val 00:08:19.100 08:26:11 -- accel/accel.sh@21 -- # val= 00:08:19.100 08:26:11 -- accel/accel.sh@22 -- # case "$var" in 00:08:19.100 08:26:11 -- accel/accel.sh@20 -- # IFS=: 00:08:19.100 08:26:11 -- accel/accel.sh@20 -- # read -r var val 00:08:20.034 08:26:12 -- accel/accel.sh@21 -- # val= 00:08:20.034 08:26:12 -- accel/accel.sh@22 -- # case "$var" in 00:08:20.034 08:26:12 -- accel/accel.sh@20 -- # IFS=: 00:08:20.034 08:26:12 -- accel/accel.sh@20 -- # read -r var val 00:08:20.034 08:26:12 -- accel/accel.sh@21 -- # val= 00:08:20.034 08:26:12 -- accel/accel.sh@22 -- # case "$var" in 00:08:20.034 08:26:12 -- accel/accel.sh@20 -- # IFS=: 00:08:20.034 08:26:12 -- accel/accel.sh@20 -- # read -r var val 00:08:20.034 08:26:12 -- accel/accel.sh@21 -- # val= 00:08:20.034 08:26:12 -- accel/accel.sh@22 -- # case "$var" in 00:08:20.034 08:26:12 -- accel/accel.sh@20 -- # IFS=: 00:08:20.034 08:26:12 -- accel/accel.sh@20 -- # read -r var val 00:08:20.034 08:26:12 -- accel/accel.sh@21 -- # val= 00:08:20.034 08:26:12 -- accel/accel.sh@22 -- # case "$var" in 00:08:20.034 08:26:12 -- accel/accel.sh@20 -- # IFS=: 00:08:20.034 08:26:12 -- accel/accel.sh@20 -- # read -r var val 00:08:20.034 08:26:12 -- accel/accel.sh@21 -- # val= 00:08:20.034 08:26:12 -- accel/accel.sh@22 -- # case "$var" in 00:08:20.034 08:26:12 -- accel/accel.sh@20 -- # IFS=: 00:08:20.034 08:26:12 -- accel/accel.sh@20 -- # read -r var val 00:08:20.034 08:26:12 -- accel/accel.sh@21 -- # val= 00:08:20.034 08:26:12 -- accel/accel.sh@22 -- # case "$var" in 00:08:20.034 08:26:12 -- accel/accel.sh@20 -- # IFS=: 00:08:20.034 08:26:12 -- accel/accel.sh@20 -- # read -r var val 00:08:20.034 08:26:12 -- accel/accel.sh@28 -- # [[ -n software ]] 00:08:20.034 08:26:12 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:08:20.034 08:26:12 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:20.034 00:08:20.034 real 0m2.553s 00:08:20.034 user 0m2.315s 00:08:20.034 sys 0m0.237s 00:08:20.034 08:26:12 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:20.034 08:26:12 -- common/autotest_common.sh@10 -- # set +x 00:08:20.035 ************************************ 00:08:20.035 END TEST accel_decomp 00:08:20.035 ************************************ 00:08:20.293 08:26:12 -- accel/accel.sh@110 -- # run_test accel_decmop_full accel_test -t 1 
-w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:20.293 08:26:12 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:08:20.293 08:26:12 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:20.293 08:26:12 -- common/autotest_common.sh@10 -- # set +x 00:08:20.293 ************************************ 00:08:20.293 START TEST accel_decmop_full 00:08:20.293 ************************************ 00:08:20.293 08:26:12 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:20.293 08:26:12 -- accel/accel.sh@16 -- # local accel_opc 00:08:20.293 08:26:12 -- accel/accel.sh@17 -- # local accel_module 00:08:20.293 08:26:12 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:20.293 08:26:12 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:20.293 08:26:12 -- accel/accel.sh@12 -- # build_accel_config 00:08:20.293 08:26:12 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:20.293 08:26:12 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:20.293 08:26:12 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:20.293 08:26:12 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:20.293 08:26:12 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:20.293 08:26:12 -- accel/accel.sh@41 -- # local IFS=, 00:08:20.293 08:26:12 -- accel/accel.sh@42 -- # jq -r . 00:08:20.293 [2024-10-04 08:26:12.759690] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:08:20.293 [2024-10-04 08:26:12.759778] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1008616 ] 00:08:20.293 EAL: No free 2048 kB hugepages reported on node 1 00:08:20.293 [2024-10-04 08:26:12.828133] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:20.293 [2024-10-04 08:26:12.862608] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:21.669 08:26:14 -- accel/accel.sh@18 -- # out='Preparing input file... 00:08:21.669 00:08:21.669 SPDK Configuration: 00:08:21.669 Core mask: 0x1 00:08:21.669 00:08:21.669 Accel Perf Configuration: 00:08:21.669 Workload Type: decompress 00:08:21.669 Transfer size: 111250 bytes 00:08:21.669 Vector count 1 00:08:21.669 Module: software 00:08:21.669 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:21.669 Queue depth: 32 00:08:21.669 Allocate depth: 32 00:08:21.669 # threads/core: 1 00:08:21.669 Run time: 1 seconds 00:08:21.669 Verify: Yes 00:08:21.669 00:08:21.669 Running for 1 seconds... 
00:08:21.669 00:08:21.669 Core,Thread Transfers Bandwidth Failed Miscompares 00:08:21.669 ------------------------------------------------------------------------------------ 00:08:21.669 0,0 5984/s 247 MiB/s 0 0 00:08:21.669 ==================================================================================== 00:08:21.669 Total 5984/s 634 MiB/s 0 0' 00:08:21.669 08:26:14 -- accel/accel.sh@20 -- # IFS=: 00:08:21.669 08:26:14 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:21.669 08:26:14 -- accel/accel.sh@20 -- # read -r var val 00:08:21.669 08:26:14 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:21.669 08:26:14 -- accel/accel.sh@12 -- # build_accel_config 00:08:21.669 08:26:14 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:21.669 08:26:14 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:21.669 08:26:14 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:21.669 08:26:14 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:21.669 08:26:14 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:21.669 08:26:14 -- accel/accel.sh@41 -- # local IFS=, 00:08:21.669 08:26:14 -- accel/accel.sh@42 -- # jq -r . 00:08:21.669 [2024-10-04 08:26:14.041463] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:08:21.670 [2024-10-04 08:26:14.041516] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1008794 ] 00:08:21.670 EAL: No free 2048 kB hugepages reported on node 1 00:08:21.670 [2024-10-04 08:26:14.104281] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:21.670 [2024-10-04 08:26:14.139288] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:21.670 08:26:14 -- accel/accel.sh@21 -- # val= 00:08:21.670 08:26:14 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.670 08:26:14 -- accel/accel.sh@20 -- # IFS=: 00:08:21.670 08:26:14 -- accel/accel.sh@20 -- # read -r var val 00:08:21.670 08:26:14 -- accel/accel.sh@21 -- # val= 00:08:21.670 08:26:14 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.670 08:26:14 -- accel/accel.sh@20 -- # IFS=: 00:08:21.670 08:26:14 -- accel/accel.sh@20 -- # read -r var val 00:08:21.670 08:26:14 -- accel/accel.sh@21 -- # val= 00:08:21.670 08:26:14 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.670 08:26:14 -- accel/accel.sh@20 -- # IFS=: 00:08:21.670 08:26:14 -- accel/accel.sh@20 -- # read -r var val 00:08:21.670 08:26:14 -- accel/accel.sh@21 -- # val=0x1 00:08:21.670 08:26:14 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.670 08:26:14 -- accel/accel.sh@20 -- # IFS=: 00:08:21.670 08:26:14 -- accel/accel.sh@20 -- # read -r var val 00:08:21.670 08:26:14 -- accel/accel.sh@21 -- # val= 00:08:21.670 08:26:14 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.670 08:26:14 -- accel/accel.sh@20 -- # IFS=: 00:08:21.670 08:26:14 -- accel/accel.sh@20 -- # read -r var val 00:08:21.670 08:26:14 -- accel/accel.sh@21 -- # val= 00:08:21.670 08:26:14 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.670 08:26:14 -- accel/accel.sh@20 -- # IFS=: 00:08:21.670 08:26:14 -- accel/accel.sh@20 -- # read -r var val 00:08:21.670 08:26:14 -- accel/accel.sh@21 -- # val=decompress 00:08:21.670 08:26:14 -- accel/accel.sh@22 -- # case 
"$var" in 00:08:21.670 08:26:14 -- accel/accel.sh@24 -- # accel_opc=decompress 00:08:21.670 08:26:14 -- accel/accel.sh@20 -- # IFS=: 00:08:21.670 08:26:14 -- accel/accel.sh@20 -- # read -r var val 00:08:21.670 08:26:14 -- accel/accel.sh@21 -- # val='111250 bytes' 00:08:21.670 08:26:14 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.670 08:26:14 -- accel/accel.sh@20 -- # IFS=: 00:08:21.670 08:26:14 -- accel/accel.sh@20 -- # read -r var val 00:08:21.670 08:26:14 -- accel/accel.sh@21 -- # val= 00:08:21.670 08:26:14 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.670 08:26:14 -- accel/accel.sh@20 -- # IFS=: 00:08:21.670 08:26:14 -- accel/accel.sh@20 -- # read -r var val 00:08:21.670 08:26:14 -- accel/accel.sh@21 -- # val=software 00:08:21.670 08:26:14 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.670 08:26:14 -- accel/accel.sh@23 -- # accel_module=software 00:08:21.670 08:26:14 -- accel/accel.sh@20 -- # IFS=: 00:08:21.670 08:26:14 -- accel/accel.sh@20 -- # read -r var val 00:08:21.670 08:26:14 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:21.670 08:26:14 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.670 08:26:14 -- accel/accel.sh@20 -- # IFS=: 00:08:21.670 08:26:14 -- accel/accel.sh@20 -- # read -r var val 00:08:21.670 08:26:14 -- accel/accel.sh@21 -- # val=32 00:08:21.670 08:26:14 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.670 08:26:14 -- accel/accel.sh@20 -- # IFS=: 00:08:21.670 08:26:14 -- accel/accel.sh@20 -- # read -r var val 00:08:21.670 08:26:14 -- accel/accel.sh@21 -- # val=32 00:08:21.670 08:26:14 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.670 08:26:14 -- accel/accel.sh@20 -- # IFS=: 00:08:21.670 08:26:14 -- accel/accel.sh@20 -- # read -r var val 00:08:21.670 08:26:14 -- accel/accel.sh@21 -- # val=1 00:08:21.670 08:26:14 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.670 08:26:14 -- accel/accel.sh@20 -- # IFS=: 00:08:21.670 08:26:14 -- accel/accel.sh@20 -- # read -r var val 00:08:21.670 08:26:14 -- accel/accel.sh@21 -- # val='1 seconds' 00:08:21.670 08:26:14 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.670 08:26:14 -- accel/accel.sh@20 -- # IFS=: 00:08:21.670 08:26:14 -- accel/accel.sh@20 -- # read -r var val 00:08:21.670 08:26:14 -- accel/accel.sh@21 -- # val=Yes 00:08:21.670 08:26:14 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.670 08:26:14 -- accel/accel.sh@20 -- # IFS=: 00:08:21.670 08:26:14 -- accel/accel.sh@20 -- # read -r var val 00:08:21.670 08:26:14 -- accel/accel.sh@21 -- # val= 00:08:21.670 08:26:14 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.670 08:26:14 -- accel/accel.sh@20 -- # IFS=: 00:08:21.670 08:26:14 -- accel/accel.sh@20 -- # read -r var val 00:08:21.670 08:26:14 -- accel/accel.sh@21 -- # val= 00:08:21.670 08:26:14 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.670 08:26:14 -- accel/accel.sh@20 -- # IFS=: 00:08:21.670 08:26:14 -- accel/accel.sh@20 -- # read -r var val 00:08:23.049 08:26:15 -- accel/accel.sh@21 -- # val= 00:08:23.049 08:26:15 -- accel/accel.sh@22 -- # case "$var" in 00:08:23.049 08:26:15 -- accel/accel.sh@20 -- # IFS=: 00:08:23.049 08:26:15 -- accel/accel.sh@20 -- # read -r var val 00:08:23.049 08:26:15 -- accel/accel.sh@21 -- # val= 00:08:23.049 08:26:15 -- accel/accel.sh@22 -- # case "$var" in 00:08:23.049 08:26:15 -- accel/accel.sh@20 -- # IFS=: 00:08:23.049 08:26:15 -- accel/accel.sh@20 -- # read -r var val 00:08:23.049 08:26:15 -- accel/accel.sh@21 -- # val= 00:08:23.049 08:26:15 -- accel/accel.sh@22 -- # case "$var" in 00:08:23.049 08:26:15 
-- accel/accel.sh@20 -- # IFS=: 00:08:23.049 08:26:15 -- accel/accel.sh@20 -- # read -r var val 00:08:23.049 08:26:15 -- accel/accel.sh@21 -- # val= 00:08:23.049 08:26:15 -- accel/accel.sh@22 -- # case "$var" in 00:08:23.049 08:26:15 -- accel/accel.sh@20 -- # IFS=: 00:08:23.049 08:26:15 -- accel/accel.sh@20 -- # read -r var val 00:08:23.049 08:26:15 -- accel/accel.sh@21 -- # val= 00:08:23.049 08:26:15 -- accel/accel.sh@22 -- # case "$var" in 00:08:23.049 08:26:15 -- accel/accel.sh@20 -- # IFS=: 00:08:23.049 08:26:15 -- accel/accel.sh@20 -- # read -r var val 00:08:23.049 08:26:15 -- accel/accel.sh@21 -- # val= 00:08:23.049 08:26:15 -- accel/accel.sh@22 -- # case "$var" in 00:08:23.049 08:26:15 -- accel/accel.sh@20 -- # IFS=: 00:08:23.049 08:26:15 -- accel/accel.sh@20 -- # read -r var val 00:08:23.049 08:26:15 -- accel/accel.sh@28 -- # [[ -n software ]] 00:08:23.049 08:26:15 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:08:23.049 08:26:15 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:23.049 00:08:23.049 real 0m2.573s 00:08:23.049 user 0m2.321s 00:08:23.049 sys 0m0.250s 00:08:23.049 08:26:15 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:23.049 08:26:15 -- common/autotest_common.sh@10 -- # set +x 00:08:23.049 ************************************ 00:08:23.049 END TEST accel_decmop_full 00:08:23.049 ************************************ 00:08:23.049 08:26:15 -- accel/accel.sh@111 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:23.049 08:26:15 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:08:23.049 08:26:15 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:23.049 08:26:15 -- common/autotest_common.sh@10 -- # set +x 00:08:23.049 ************************************ 00:08:23.049 START TEST accel_decomp_mcore 00:08:23.049 ************************************ 00:08:23.049 08:26:15 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:23.049 08:26:15 -- accel/accel.sh@16 -- # local accel_opc 00:08:23.049 08:26:15 -- accel/accel.sh@17 -- # local accel_module 00:08:23.049 08:26:15 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:23.049 08:26:15 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:23.049 08:26:15 -- accel/accel.sh@12 -- # build_accel_config 00:08:23.049 08:26:15 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:23.049 08:26:15 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:23.049 08:26:15 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:23.049 08:26:15 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:23.049 08:26:15 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:23.049 08:26:15 -- accel/accel.sh@41 -- # local IFS=, 00:08:23.049 08:26:15 -- accel/accel.sh@42 -- # jq -r . 00:08:23.049 [2024-10-04 08:26:15.370570] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
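The accel_decmop_full pass just completed ("decmop" is spelled that way in the test script itself) differs from accel_decomp only by -o 0; judging from the configuration dump, that swaps the default 4096-byte transfer for the full 111250-byte compressed chunk. Hedged sketch:

  # full-buffer decompress; -o 0 appears to select the whole chunk size
  sudo "$SPDK_DIR/build/examples/accel_perf" -t 1 -w decompress \
      -l "$SPDK_DIR/test/accel/bib" -y -o 0
  echo $(( 5984 * 111250 / 1048576 ))   # prints 634, matching the Total row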
00:08:23.049 [2024-10-04 08:26:15.370663] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1009083 ] 00:08:23.049 EAL: No free 2048 kB hugepages reported on node 1 00:08:23.049 [2024-10-04 08:26:15.438436] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:23.049 [2024-10-04 08:26:15.475360] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:08:23.049 [2024-10-04 08:26:15.475458] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:08:23.049 [2024-10-04 08:26:15.475519] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:08:23.049 [2024-10-04 08:26:15.475521] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:24.002 08:26:16 -- accel/accel.sh@18 -- # out='Preparing input file... 00:08:24.002 00:08:24.002 SPDK Configuration: 00:08:24.002 Core mask: 0xf 00:08:24.002 00:08:24.002 Accel Perf Configuration: 00:08:24.002 Workload Type: decompress 00:08:24.002 Transfer size: 4096 bytes 00:08:24.002 Vector count 1 00:08:24.002 Module: software 00:08:24.002 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:24.002 Queue depth: 32 00:08:24.002 Allocate depth: 32 00:08:24.002 # threads/core: 1 00:08:24.002 Run time: 1 seconds 00:08:24.002 Verify: Yes 00:08:24.002 00:08:24.002 Running for 1 seconds... 00:08:24.002 00:08:24.002 Core,Thread Transfers Bandwidth Failed Miscompares 00:08:24.002 ------------------------------------------------------------------------------------ 00:08:24.002 0,0 78208/s 144 MiB/s 0 0 00:08:24.002 3,0 78464/s 144 MiB/s 0 0 00:08:24.002 2,0 78208/s 144 MiB/s 0 0 00:08:24.002 1,0 77888/s 143 MiB/s 0 0 00:08:24.002 ==================================================================================== 00:08:24.002 Total 312768/s 1221 MiB/s 0 0' 00:08:24.002 08:26:16 -- accel/accel.sh@20 -- # IFS=: 00:08:24.002 08:26:16 -- accel/accel.sh@20 -- # read -r var val 00:08:24.002 08:26:16 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:24.002 08:26:16 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:24.002 08:26:16 -- accel/accel.sh@12 -- # build_accel_config 00:08:24.002 08:26:16 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:24.002 08:26:16 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:24.002 08:26:16 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:24.002 08:26:16 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:24.002 08:26:16 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:24.002 08:26:16 -- accel/accel.sh@41 -- # local IFS=, 00:08:24.002 08:26:16 -- accel/accel.sh@42 -- # jq -r . 00:08:24.002 [2024-10-04 08:26:16.664116] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
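With -m 0xf the app starts four reactors (cores 0 through 3), and the per-core rows above sum exactly to the Total: 78208 + 78464 + 78208 + 77888 = 312768 ops/s, with each core sustaining a bit less than the single-core rate seen in accel_decomp. Hedged sketch of the multi-core variant:

  sudo "$SPDK_DIR/build/examples/accel_perf" -t 1 -w decompress \
      -l "$SPDK_DIR/test/accel/bib" -y -m 0xf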
00:08:24.002 [2024-10-04 08:26:16.664213] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1009352 ] 00:08:24.276 EAL: No free 2048 kB hugepages reported on node 1 00:08:24.276 [2024-10-04 08:26:16.733837] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:24.276 [2024-10-04 08:26:16.772057] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:08:24.276 [2024-10-04 08:26:16.772156] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:08:24.276 [2024-10-04 08:26:16.772244] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:08:24.276 [2024-10-04 08:26:16.772246] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:24.276 08:26:16 -- accel/accel.sh@21 -- # val= 00:08:24.276 08:26:16 -- accel/accel.sh@22 -- # case "$var" in 00:08:24.276 08:26:16 -- accel/accel.sh@20 -- # IFS=: 00:08:24.276 08:26:16 -- accel/accel.sh@20 -- # read -r var val 00:08:24.276 08:26:16 -- accel/accel.sh@21 -- # val= 00:08:24.276 08:26:16 -- accel/accel.sh@22 -- # case "$var" in 00:08:24.276 08:26:16 -- accel/accel.sh@20 -- # IFS=: 00:08:24.276 08:26:16 -- accel/accel.sh@20 -- # read -r var val 00:08:24.276 08:26:16 -- accel/accel.sh@21 -- # val= 00:08:24.276 08:26:16 -- accel/accel.sh@22 -- # case "$var" in 00:08:24.276 08:26:16 -- accel/accel.sh@20 -- # IFS=: 00:08:24.276 08:26:16 -- accel/accel.sh@20 -- # read -r var val 00:08:24.276 08:26:16 -- accel/accel.sh@21 -- # val=0xf 00:08:24.276 08:26:16 -- accel/accel.sh@22 -- # case "$var" in 00:08:24.276 08:26:16 -- accel/accel.sh@20 -- # IFS=: 00:08:24.276 08:26:16 -- accel/accel.sh@20 -- # read -r var val 00:08:24.276 08:26:16 -- accel/accel.sh@21 -- # val= 00:08:24.276 08:26:16 -- accel/accel.sh@22 -- # case "$var" in 00:08:24.276 08:26:16 -- accel/accel.sh@20 -- # IFS=: 00:08:24.276 08:26:16 -- accel/accel.sh@20 -- # read -r var val 00:08:24.276 08:26:16 -- accel/accel.sh@21 -- # val= 00:08:24.276 08:26:16 -- accel/accel.sh@22 -- # case "$var" in 00:08:24.276 08:26:16 -- accel/accel.sh@20 -- # IFS=: 00:08:24.276 08:26:16 -- accel/accel.sh@20 -- # read -r var val 00:08:24.276 08:26:16 -- accel/accel.sh@21 -- # val=decompress 00:08:24.276 08:26:16 -- accel/accel.sh@22 -- # case "$var" in 00:08:24.276 08:26:16 -- accel/accel.sh@24 -- # accel_opc=decompress 00:08:24.276 08:26:16 -- accel/accel.sh@20 -- # IFS=: 00:08:24.276 08:26:16 -- accel/accel.sh@20 -- # read -r var val 00:08:24.276 08:26:16 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:24.276 08:26:16 -- accel/accel.sh@22 -- # case "$var" in 00:08:24.276 08:26:16 -- accel/accel.sh@20 -- # IFS=: 00:08:24.276 08:26:16 -- accel/accel.sh@20 -- # read -r var val 00:08:24.276 08:26:16 -- accel/accel.sh@21 -- # val= 00:08:24.276 08:26:16 -- accel/accel.sh@22 -- # case "$var" in 00:08:24.276 08:26:16 -- accel/accel.sh@20 -- # IFS=: 00:08:24.276 08:26:16 -- accel/accel.sh@20 -- # read -r var val 00:08:24.276 08:26:16 -- accel/accel.sh@21 -- # val=software 00:08:24.276 08:26:16 -- accel/accel.sh@22 -- # case "$var" in 00:08:24.276 08:26:16 -- accel/accel.sh@23 -- # accel_module=software 00:08:24.276 08:26:16 -- accel/accel.sh@20 -- # IFS=: 00:08:24.276 08:26:16 -- accel/accel.sh@20 -- # read -r var val 00:08:24.276 08:26:16 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:24.276 08:26:16 -- accel/accel.sh@22 -- # case 
"$var" in 00:08:24.276 08:26:16 -- accel/accel.sh@20 -- # IFS=: 00:08:24.276 08:26:16 -- accel/accel.sh@20 -- # read -r var val 00:08:24.276 08:26:16 -- accel/accel.sh@21 -- # val=32 00:08:24.276 08:26:16 -- accel/accel.sh@22 -- # case "$var" in 00:08:24.276 08:26:16 -- accel/accel.sh@20 -- # IFS=: 00:08:24.276 08:26:16 -- accel/accel.sh@20 -- # read -r var val 00:08:24.276 08:26:16 -- accel/accel.sh@21 -- # val=32 00:08:24.276 08:26:16 -- accel/accel.sh@22 -- # case "$var" in 00:08:24.276 08:26:16 -- accel/accel.sh@20 -- # IFS=: 00:08:24.276 08:26:16 -- accel/accel.sh@20 -- # read -r var val 00:08:24.276 08:26:16 -- accel/accel.sh@21 -- # val=1 00:08:24.276 08:26:16 -- accel/accel.sh@22 -- # case "$var" in 00:08:24.276 08:26:16 -- accel/accel.sh@20 -- # IFS=: 00:08:24.276 08:26:16 -- accel/accel.sh@20 -- # read -r var val 00:08:24.276 08:26:16 -- accel/accel.sh@21 -- # val='1 seconds' 00:08:24.276 08:26:16 -- accel/accel.sh@22 -- # case "$var" in 00:08:24.276 08:26:16 -- accel/accel.sh@20 -- # IFS=: 00:08:24.276 08:26:16 -- accel/accel.sh@20 -- # read -r var val 00:08:24.276 08:26:16 -- accel/accel.sh@21 -- # val=Yes 00:08:24.276 08:26:16 -- accel/accel.sh@22 -- # case "$var" in 00:08:24.276 08:26:16 -- accel/accel.sh@20 -- # IFS=: 00:08:24.276 08:26:16 -- accel/accel.sh@20 -- # read -r var val 00:08:24.276 08:26:16 -- accel/accel.sh@21 -- # val= 00:08:24.276 08:26:16 -- accel/accel.sh@22 -- # case "$var" in 00:08:24.276 08:26:16 -- accel/accel.sh@20 -- # IFS=: 00:08:24.276 08:26:16 -- accel/accel.sh@20 -- # read -r var val 00:08:24.276 08:26:16 -- accel/accel.sh@21 -- # val= 00:08:24.276 08:26:16 -- accel/accel.sh@22 -- # case "$var" in 00:08:24.276 08:26:16 -- accel/accel.sh@20 -- # IFS=: 00:08:24.276 08:26:16 -- accel/accel.sh@20 -- # read -r var val 00:08:25.654 08:26:17 -- accel/accel.sh@21 -- # val= 00:08:25.654 08:26:17 -- accel/accel.sh@22 -- # case "$var" in 00:08:25.654 08:26:17 -- accel/accel.sh@20 -- # IFS=: 00:08:25.654 08:26:17 -- accel/accel.sh@20 -- # read -r var val 00:08:25.655 08:26:17 -- accel/accel.sh@21 -- # val= 00:08:25.655 08:26:17 -- accel/accel.sh@22 -- # case "$var" in 00:08:25.655 08:26:17 -- accel/accel.sh@20 -- # IFS=: 00:08:25.655 08:26:17 -- accel/accel.sh@20 -- # read -r var val 00:08:25.655 08:26:17 -- accel/accel.sh@21 -- # val= 00:08:25.655 08:26:17 -- accel/accel.sh@22 -- # case "$var" in 00:08:25.655 08:26:17 -- accel/accel.sh@20 -- # IFS=: 00:08:25.655 08:26:17 -- accel/accel.sh@20 -- # read -r var val 00:08:25.655 08:26:17 -- accel/accel.sh@21 -- # val= 00:08:25.655 08:26:17 -- accel/accel.sh@22 -- # case "$var" in 00:08:25.655 08:26:17 -- accel/accel.sh@20 -- # IFS=: 00:08:25.655 08:26:17 -- accel/accel.sh@20 -- # read -r var val 00:08:25.655 08:26:17 -- accel/accel.sh@21 -- # val= 00:08:25.655 08:26:17 -- accel/accel.sh@22 -- # case "$var" in 00:08:25.655 08:26:17 -- accel/accel.sh@20 -- # IFS=: 00:08:25.655 08:26:17 -- accel/accel.sh@20 -- # read -r var val 00:08:25.655 08:26:17 -- accel/accel.sh@21 -- # val= 00:08:25.655 08:26:17 -- accel/accel.sh@22 -- # case "$var" in 00:08:25.655 08:26:17 -- accel/accel.sh@20 -- # IFS=: 00:08:25.655 08:26:17 -- accel/accel.sh@20 -- # read -r var val 00:08:25.655 08:26:17 -- accel/accel.sh@21 -- # val= 00:08:25.655 08:26:17 -- accel/accel.sh@22 -- # case "$var" in 00:08:25.655 08:26:17 -- accel/accel.sh@20 -- # IFS=: 00:08:25.655 08:26:17 -- accel/accel.sh@20 -- # read -r var val 00:08:25.655 08:26:17 -- accel/accel.sh@21 -- # val= 00:08:25.655 08:26:17 -- accel/accel.sh@22 -- # case "$var" in 00:08:25.655 
08:26:17 -- accel/accel.sh@20 -- # IFS=: 00:08:25.655 08:26:17 -- accel/accel.sh@20 -- # read -r var val 00:08:25.655 08:26:17 -- accel/accel.sh@21 -- # val= 00:08:25.655 08:26:17 -- accel/accel.sh@22 -- # case "$var" in 00:08:25.655 08:26:17 -- accel/accel.sh@20 -- # IFS=: 00:08:25.655 08:26:17 -- accel/accel.sh@20 -- # read -r var val 00:08:25.655 08:26:17 -- accel/accel.sh@28 -- # [[ -n software ]] 00:08:25.655 08:26:17 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:08:25.655 08:26:17 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:25.655 00:08:25.655 real 0m2.598s 00:08:25.655 user 0m8.983s 00:08:25.655 sys 0m0.273s 00:08:25.655 08:26:17 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:25.655 08:26:17 -- common/autotest_common.sh@10 -- # set +x 00:08:25.655 ************************************ 00:08:25.655 END TEST accel_decomp_mcore 00:08:25.655 ************************************ 00:08:25.655 08:26:17 -- accel/accel.sh@112 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:25.655 08:26:17 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:08:25.655 08:26:17 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:25.655 08:26:17 -- common/autotest_common.sh@10 -- # set +x 00:08:25.655 ************************************ 00:08:25.655 START TEST accel_decomp_full_mcore 00:08:25.655 ************************************ 00:08:25.655 08:26:17 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:25.655 08:26:17 -- accel/accel.sh@16 -- # local accel_opc 00:08:25.655 08:26:17 -- accel/accel.sh@17 -- # local accel_module 00:08:25.655 08:26:18 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:25.655 08:26:18 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:25.655 08:26:18 -- accel/accel.sh@12 -- # build_accel_config 00:08:25.655 08:26:18 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:25.655 08:26:18 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:25.655 08:26:18 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:25.655 08:26:18 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:25.655 08:26:18 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:25.655 08:26:18 -- accel/accel.sh@41 -- # local IFS=, 00:08:25.655 08:26:18 -- accel/accel.sh@42 -- # jq -r . 00:08:25.655 [2024-10-04 08:26:18.018902] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
00:08:25.655 [2024-10-04 08:26:18.018992] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1009639 ] 00:08:25.655 EAL: No free 2048 kB hugepages reported on node 1 00:08:25.655 [2024-10-04 08:26:18.089073] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:25.655 [2024-10-04 08:26:18.126474] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:08:25.655 [2024-10-04 08:26:18.126571] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:08:25.655 [2024-10-04 08:26:18.126655] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:08:25.655 [2024-10-04 08:26:18.126657] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:27.031 08:26:19 -- accel/accel.sh@18 -- # out='Preparing input file... 00:08:27.031 00:08:27.031 SPDK Configuration: 00:08:27.031 Core mask: 0xf 00:08:27.031 00:08:27.031 Accel Perf Configuration: 00:08:27.031 Workload Type: decompress 00:08:27.031 Transfer size: 111250 bytes 00:08:27.031 Vector count 1 00:08:27.031 Module: software 00:08:27.031 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:27.031 Queue depth: 32 00:08:27.031 Allocate depth: 32 00:08:27.031 # threads/core: 1 00:08:27.031 Run time: 1 seconds 00:08:27.031 Verify: Yes 00:08:27.031 00:08:27.031 Running for 1 seconds... 00:08:27.031 00:08:27.031 Core,Thread Transfers Bandwidth Failed Miscompares 00:08:27.031 ------------------------------------------------------------------------------------ 00:08:27.031 0,0 5792/s 239 MiB/s 0 0 00:08:27.031 3,0 5824/s 240 MiB/s 0 0 00:08:27.031 2,0 5824/s 240 MiB/s 0 0 00:08:27.031 1,0 5824/s 240 MiB/s 0 0 00:08:27.031 ==================================================================================== 00:08:27.031 Total 23264/s 2468 MiB/s 0 0' 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # IFS=: 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # read -r var val 00:08:27.031 08:26:19 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:27.031 08:26:19 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:27.031 08:26:19 -- accel/accel.sh@12 -- # build_accel_config 00:08:27.031 08:26:19 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:27.031 08:26:19 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:27.031 08:26:19 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:27.031 08:26:19 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:27.031 08:26:19 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:27.031 08:26:19 -- accel/accel.sh@41 -- # local IFS=, 00:08:27.031 08:26:19 -- accel/accel.sh@42 -- # jq -r . 00:08:27.031 [2024-10-04 08:26:19.323136] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
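The full_mcore run above combines the two previous variants, -o 0 and -m 0xf: each of the four cores moves roughly 5.8k of the 111250-byte buffers per second, and 23264 * 111250 B/s works out to about 2468 MiB/s, matching the Total row. Hedged sketch:

  sudo "$SPDK_DIR/build/examples/accel_perf" -t 1 -w decompress \
      -l "$SPDK_DIR/test/accel/bib" -y -o 0 -m 0xf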
00:08:27.031 [2024-10-04 08:26:19.323234] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1009915 ] 00:08:27.031 EAL: No free 2048 kB hugepages reported on node 1 00:08:27.031 [2024-10-04 08:26:19.391518] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:27.031 [2024-10-04 08:26:19.427934] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:08:27.031 [2024-10-04 08:26:19.428031] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:08:27.031 [2024-10-04 08:26:19.428114] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:08:27.031 [2024-10-04 08:26:19.428116] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:27.031 08:26:19 -- accel/accel.sh@21 -- # val= 00:08:27.031 08:26:19 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # IFS=: 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # read -r var val 00:08:27.031 08:26:19 -- accel/accel.sh@21 -- # val= 00:08:27.031 08:26:19 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # IFS=: 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # read -r var val 00:08:27.031 08:26:19 -- accel/accel.sh@21 -- # val= 00:08:27.031 08:26:19 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # IFS=: 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # read -r var val 00:08:27.031 08:26:19 -- accel/accel.sh@21 -- # val=0xf 00:08:27.031 08:26:19 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # IFS=: 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # read -r var val 00:08:27.031 08:26:19 -- accel/accel.sh@21 -- # val= 00:08:27.031 08:26:19 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # IFS=: 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # read -r var val 00:08:27.031 08:26:19 -- accel/accel.sh@21 -- # val= 00:08:27.031 08:26:19 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # IFS=: 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # read -r var val 00:08:27.031 08:26:19 -- accel/accel.sh@21 -- # val=decompress 00:08:27.031 08:26:19 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.031 08:26:19 -- accel/accel.sh@24 -- # accel_opc=decompress 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # IFS=: 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # read -r var val 00:08:27.031 08:26:19 -- accel/accel.sh@21 -- # val='111250 bytes' 00:08:27.031 08:26:19 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # IFS=: 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # read -r var val 00:08:27.031 08:26:19 -- accel/accel.sh@21 -- # val= 00:08:27.031 08:26:19 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # IFS=: 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # read -r var val 00:08:27.031 08:26:19 -- accel/accel.sh@21 -- # val=software 00:08:27.031 08:26:19 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.031 08:26:19 -- accel/accel.sh@23 -- # accel_module=software 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # IFS=: 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # read -r var val 00:08:27.031 08:26:19 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:27.031 08:26:19 -- accel/accel.sh@22 -- # case 
"$var" in 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # IFS=: 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # read -r var val 00:08:27.031 08:26:19 -- accel/accel.sh@21 -- # val=32 00:08:27.031 08:26:19 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # IFS=: 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # read -r var val 00:08:27.031 08:26:19 -- accel/accel.sh@21 -- # val=32 00:08:27.031 08:26:19 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # IFS=: 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # read -r var val 00:08:27.031 08:26:19 -- accel/accel.sh@21 -- # val=1 00:08:27.031 08:26:19 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # IFS=: 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # read -r var val 00:08:27.031 08:26:19 -- accel/accel.sh@21 -- # val='1 seconds' 00:08:27.031 08:26:19 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # IFS=: 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # read -r var val 00:08:27.031 08:26:19 -- accel/accel.sh@21 -- # val=Yes 00:08:27.031 08:26:19 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # IFS=: 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # read -r var val 00:08:27.031 08:26:19 -- accel/accel.sh@21 -- # val= 00:08:27.031 08:26:19 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # IFS=: 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # read -r var val 00:08:27.031 08:26:19 -- accel/accel.sh@21 -- # val= 00:08:27.031 08:26:19 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # IFS=: 00:08:27.031 08:26:19 -- accel/accel.sh@20 -- # read -r var val 00:08:27.966 08:26:20 -- accel/accel.sh@21 -- # val= 00:08:27.966 08:26:20 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.966 08:26:20 -- accel/accel.sh@20 -- # IFS=: 00:08:27.966 08:26:20 -- accel/accel.sh@20 -- # read -r var val 00:08:27.966 08:26:20 -- accel/accel.sh@21 -- # val= 00:08:27.966 08:26:20 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.966 08:26:20 -- accel/accel.sh@20 -- # IFS=: 00:08:27.966 08:26:20 -- accel/accel.sh@20 -- # read -r var val 00:08:27.966 08:26:20 -- accel/accel.sh@21 -- # val= 00:08:27.966 08:26:20 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.966 08:26:20 -- accel/accel.sh@20 -- # IFS=: 00:08:27.966 08:26:20 -- accel/accel.sh@20 -- # read -r var val 00:08:27.966 08:26:20 -- accel/accel.sh@21 -- # val= 00:08:27.966 08:26:20 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.966 08:26:20 -- accel/accel.sh@20 -- # IFS=: 00:08:27.966 08:26:20 -- accel/accel.sh@20 -- # read -r var val 00:08:27.966 08:26:20 -- accel/accel.sh@21 -- # val= 00:08:27.966 08:26:20 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.966 08:26:20 -- accel/accel.sh@20 -- # IFS=: 00:08:27.966 08:26:20 -- accel/accel.sh@20 -- # read -r var val 00:08:27.966 08:26:20 -- accel/accel.sh@21 -- # val= 00:08:27.966 08:26:20 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.966 08:26:20 -- accel/accel.sh@20 -- # IFS=: 00:08:27.966 08:26:20 -- accel/accel.sh@20 -- # read -r var val 00:08:27.966 08:26:20 -- accel/accel.sh@21 -- # val= 00:08:27.966 08:26:20 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.966 08:26:20 -- accel/accel.sh@20 -- # IFS=: 00:08:27.966 08:26:20 -- accel/accel.sh@20 -- # read -r var val 00:08:27.966 08:26:20 -- accel/accel.sh@21 -- # val= 00:08:27.966 08:26:20 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.966 
08:26:20 -- accel/accel.sh@20 -- # IFS=: 00:08:27.966 08:26:20 -- accel/accel.sh@20 -- # read -r var val 00:08:27.966 08:26:20 -- accel/accel.sh@21 -- # val= 00:08:27.966 08:26:20 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.966 08:26:20 -- accel/accel.sh@20 -- # IFS=: 00:08:27.966 08:26:20 -- accel/accel.sh@20 -- # read -r var val 00:08:27.966 08:26:20 -- accel/accel.sh@28 -- # [[ -n software ]] 00:08:27.966 08:26:20 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:08:27.966 08:26:20 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:27.966 00:08:27.966 real 0m2.615s 00:08:27.966 user 0m9.038s 00:08:27.966 sys 0m0.273s 00:08:27.966 08:26:20 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:27.966 08:26:20 -- common/autotest_common.sh@10 -- # set +x 00:08:27.966 ************************************ 00:08:27.966 END TEST accel_decomp_full_mcore 00:08:27.966 ************************************ 00:08:28.225 08:26:20 -- accel/accel.sh@113 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:28.225 08:26:20 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:08:28.225 08:26:20 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:28.225 08:26:20 -- common/autotest_common.sh@10 -- # set +x 00:08:28.225 ************************************ 00:08:28.225 START TEST accel_decomp_mthread 00:08:28.225 ************************************ 00:08:28.225 08:26:20 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:28.225 08:26:20 -- accel/accel.sh@16 -- # local accel_opc 00:08:28.225 08:26:20 -- accel/accel.sh@17 -- # local accel_module 00:08:28.225 08:26:20 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:28.225 08:26:20 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:28.225 08:26:20 -- accel/accel.sh@12 -- # build_accel_config 00:08:28.225 08:26:20 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:28.225 08:26:20 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:28.225 08:26:20 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:28.225 08:26:20 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:28.225 08:26:20 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:28.225 08:26:20 -- accel/accel.sh@41 -- # local IFS=, 00:08:28.225 08:26:20 -- accel/accel.sh@42 -- # jq -r . 00:08:28.225 [2024-10-04 08:26:20.686776] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:08:28.225 [2024-10-04 08:26:20.686884] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1010136 ] 00:08:28.225 EAL: No free 2048 kB hugepages reported on node 1 00:08:28.225 [2024-10-04 08:26:20.754994] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:28.225 [2024-10-04 08:26:20.790025] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:29.601 08:26:21 -- accel/accel.sh@18 -- # out='Preparing input file... 
00:08:29.601 00:08:29.601 SPDK Configuration: 00:08:29.601 Core mask: 0x1 00:08:29.601 00:08:29.601 Accel Perf Configuration: 00:08:29.601 Workload Type: decompress 00:08:29.601 Transfer size: 4096 bytes 00:08:29.601 Vector count 1 00:08:29.601 Module: software 00:08:29.601 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:29.601 Queue depth: 32 00:08:29.601 Allocate depth: 32 00:08:29.601 # threads/core: 2 00:08:29.601 Run time: 1 seconds 00:08:29.601 Verify: Yes 00:08:29.601 00:08:29.601 Running for 1 seconds... 00:08:29.601 00:08:29.601 Core,Thread Transfers Bandwidth Failed Miscompares 00:08:29.601 ------------------------------------------------------------------------------------ 00:08:29.601 0,1 48000/s 88 MiB/s 0 0 00:08:29.601 0,0 47840/s 88 MiB/s 0 0 00:08:29.601 ==================================================================================== 00:08:29.601 Total 95840/s 374 MiB/s 0 0' 00:08:29.601 08:26:21 -- accel/accel.sh@20 -- # IFS=: 00:08:29.601 08:26:21 -- accel/accel.sh@20 -- # read -r var val 00:08:29.601 08:26:21 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:29.601 08:26:21 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:29.601 08:26:21 -- accel/accel.sh@12 -- # build_accel_config 00:08:29.601 08:26:21 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:29.601 08:26:21 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:29.601 08:26:21 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:29.601 08:26:21 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:29.601 08:26:21 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:29.601 08:26:21 -- accel/accel.sh@41 -- # local IFS=, 00:08:29.601 08:26:21 -- accel/accel.sh@42 -- # jq -r . 00:08:29.601 [2024-10-04 08:26:21.976265] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
00:08:29.601 [2024-10-04 08:26:21.976391] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1010282 ] 00:08:29.601 EAL: No free 2048 kB hugepages reported on node 1 00:08:29.601 [2024-10-04 08:26:22.046737] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:29.601 [2024-10-04 08:26:22.082175] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:29.601 08:26:22 -- accel/accel.sh@21 -- # val= 00:08:29.601 08:26:22 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.601 08:26:22 -- accel/accel.sh@20 -- # IFS=: 00:08:29.601 08:26:22 -- accel/accel.sh@20 -- # read -r var val 00:08:29.601 08:26:22 -- accel/accel.sh@21 -- # val= 00:08:29.601 08:26:22 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.601 08:26:22 -- accel/accel.sh@20 -- # IFS=: 00:08:29.601 08:26:22 -- accel/accel.sh@20 -- # read -r var val 00:08:29.601 08:26:22 -- accel/accel.sh@21 -- # val= 00:08:29.601 08:26:22 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.601 08:26:22 -- accel/accel.sh@20 -- # IFS=: 00:08:29.601 08:26:22 -- accel/accel.sh@20 -- # read -r var val 00:08:29.601 08:26:22 -- accel/accel.sh@21 -- # val=0x1 00:08:29.601 08:26:22 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.601 08:26:22 -- accel/accel.sh@20 -- # IFS=: 00:08:29.601 08:26:22 -- accel/accel.sh@20 -- # read -r var val 00:08:29.601 08:26:22 -- accel/accel.sh@21 -- # val= 00:08:29.601 08:26:22 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.601 08:26:22 -- accel/accel.sh@20 -- # IFS=: 00:08:29.601 08:26:22 -- accel/accel.sh@20 -- # read -r var val 00:08:29.601 08:26:22 -- accel/accel.sh@21 -- # val= 00:08:29.601 08:26:22 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.601 08:26:22 -- accel/accel.sh@20 -- # IFS=: 00:08:29.601 08:26:22 -- accel/accel.sh@20 -- # read -r var val 00:08:29.601 08:26:22 -- accel/accel.sh@21 -- # val=decompress 00:08:29.601 08:26:22 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.601 08:26:22 -- accel/accel.sh@24 -- # accel_opc=decompress 00:08:29.601 08:26:22 -- accel/accel.sh@20 -- # IFS=: 00:08:29.601 08:26:22 -- accel/accel.sh@20 -- # read -r var val 00:08:29.601 08:26:22 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:29.601 08:26:22 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.601 08:26:22 -- accel/accel.sh@20 -- # IFS=: 00:08:29.601 08:26:22 -- accel/accel.sh@20 -- # read -r var val 00:08:29.601 08:26:22 -- accel/accel.sh@21 -- # val= 00:08:29.601 08:26:22 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.601 08:26:22 -- accel/accel.sh@20 -- # IFS=: 00:08:29.601 08:26:22 -- accel/accel.sh@20 -- # read -r var val 00:08:29.601 08:26:22 -- accel/accel.sh@21 -- # val=software 00:08:29.601 08:26:22 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.601 08:26:22 -- accel/accel.sh@23 -- # accel_module=software 00:08:29.601 08:26:22 -- accel/accel.sh@20 -- # IFS=: 00:08:29.601 08:26:22 -- accel/accel.sh@20 -- # read -r var val 00:08:29.601 08:26:22 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:29.602 08:26:22 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.602 08:26:22 -- accel/accel.sh@20 -- # IFS=: 00:08:29.602 08:26:22 -- accel/accel.sh@20 -- # read -r var val 00:08:29.602 08:26:22 -- accel/accel.sh@21 -- # val=32 00:08:29.602 08:26:22 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.602 08:26:22 -- accel/accel.sh@20 -- # IFS=: 00:08:29.602 
08:26:22 -- accel/accel.sh@20 -- # read -r var val 00:08:29.602 08:26:22 -- accel/accel.sh@21 -- # val=32 00:08:29.602 08:26:22 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.602 08:26:22 -- accel/accel.sh@20 -- # IFS=: 00:08:29.602 08:26:22 -- accel/accel.sh@20 -- # read -r var val 00:08:29.602 08:26:22 -- accel/accel.sh@21 -- # val=2 00:08:29.602 08:26:22 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.602 08:26:22 -- accel/accel.sh@20 -- # IFS=: 00:08:29.602 08:26:22 -- accel/accel.sh@20 -- # read -r var val 00:08:29.602 08:26:22 -- accel/accel.sh@21 -- # val='1 seconds' 00:08:29.602 08:26:22 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.602 08:26:22 -- accel/accel.sh@20 -- # IFS=: 00:08:29.602 08:26:22 -- accel/accel.sh@20 -- # read -r var val 00:08:29.602 08:26:22 -- accel/accel.sh@21 -- # val=Yes 00:08:29.602 08:26:22 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.602 08:26:22 -- accel/accel.sh@20 -- # IFS=: 00:08:29.602 08:26:22 -- accel/accel.sh@20 -- # read -r var val 00:08:29.602 08:26:22 -- accel/accel.sh@21 -- # val= 00:08:29.602 08:26:22 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.602 08:26:22 -- accel/accel.sh@20 -- # IFS=: 00:08:29.602 08:26:22 -- accel/accel.sh@20 -- # read -r var val 00:08:29.602 08:26:22 -- accel/accel.sh@21 -- # val= 00:08:29.602 08:26:22 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.602 08:26:22 -- accel/accel.sh@20 -- # IFS=: 00:08:29.602 08:26:22 -- accel/accel.sh@20 -- # read -r var val 00:08:30.980 08:26:23 -- accel/accel.sh@21 -- # val= 00:08:30.980 08:26:23 -- accel/accel.sh@22 -- # case "$var" in 00:08:30.980 08:26:23 -- accel/accel.sh@20 -- # IFS=: 00:08:30.980 08:26:23 -- accel/accel.sh@20 -- # read -r var val 00:08:30.980 08:26:23 -- accel/accel.sh@21 -- # val= 00:08:30.980 08:26:23 -- accel/accel.sh@22 -- # case "$var" in 00:08:30.980 08:26:23 -- accel/accel.sh@20 -- # IFS=: 00:08:30.980 08:26:23 -- accel/accel.sh@20 -- # read -r var val 00:08:30.980 08:26:23 -- accel/accel.sh@21 -- # val= 00:08:30.980 08:26:23 -- accel/accel.sh@22 -- # case "$var" in 00:08:30.980 08:26:23 -- accel/accel.sh@20 -- # IFS=: 00:08:30.980 08:26:23 -- accel/accel.sh@20 -- # read -r var val 00:08:30.980 08:26:23 -- accel/accel.sh@21 -- # val= 00:08:30.980 08:26:23 -- accel/accel.sh@22 -- # case "$var" in 00:08:30.980 08:26:23 -- accel/accel.sh@20 -- # IFS=: 00:08:30.980 08:26:23 -- accel/accel.sh@20 -- # read -r var val 00:08:30.981 08:26:23 -- accel/accel.sh@21 -- # val= 00:08:30.981 08:26:23 -- accel/accel.sh@22 -- # case "$var" in 00:08:30.981 08:26:23 -- accel/accel.sh@20 -- # IFS=: 00:08:30.981 08:26:23 -- accel/accel.sh@20 -- # read -r var val 00:08:30.981 08:26:23 -- accel/accel.sh@21 -- # val= 00:08:30.981 08:26:23 -- accel/accel.sh@22 -- # case "$var" in 00:08:30.981 08:26:23 -- accel/accel.sh@20 -- # IFS=: 00:08:30.981 08:26:23 -- accel/accel.sh@20 -- # read -r var val 00:08:30.981 08:26:23 -- accel/accel.sh@21 -- # val= 00:08:30.981 08:26:23 -- accel/accel.sh@22 -- # case "$var" in 00:08:30.981 08:26:23 -- accel/accel.sh@20 -- # IFS=: 00:08:30.981 08:26:23 -- accel/accel.sh@20 -- # read -r var val 00:08:30.981 08:26:23 -- accel/accel.sh@28 -- # [[ -n software ]] 00:08:30.981 08:26:23 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:08:30.981 08:26:23 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:30.981 00:08:30.981 real 0m2.589s 00:08:30.981 user 0m2.342s 00:08:30.981 sys 0m0.257s 00:08:30.981 08:26:23 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:30.981 08:26:23 -- common/autotest_common.sh@10 -- # 
set +x 00:08:30.981 ************************************ 00:08:30.981 END TEST accel_decomp_mthread 00:08:30.981 ************************************ 00:08:30.981 08:26:23 -- accel/accel.sh@114 -- # run_test accel_deomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:30.981 08:26:23 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:08:30.981 08:26:23 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:30.981 08:26:23 -- common/autotest_common.sh@10 -- # set +x 00:08:30.981 ************************************ 00:08:30.981 START TEST accel_deomp_full_mthread 00:08:30.981 ************************************ 00:08:30.981 08:26:23 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:30.981 08:26:23 -- accel/accel.sh@16 -- # local accel_opc 00:08:30.981 08:26:23 -- accel/accel.sh@17 -- # local accel_module 00:08:30.981 08:26:23 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:30.981 08:26:23 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:30.981 08:26:23 -- accel/accel.sh@12 -- # build_accel_config 00:08:30.981 08:26:23 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:30.981 08:26:23 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:30.981 08:26:23 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:30.981 08:26:23 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:30.981 08:26:23 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:30.981 08:26:23 -- accel/accel.sh@41 -- # local IFS=, 00:08:30.981 08:26:23 -- accel/accel.sh@42 -- # jq -r . 00:08:30.981 [2024-10-04 08:26:23.324854] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:08:30.981 [2024-10-04 08:26:23.324947] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1010508 ] 00:08:30.981 EAL: No free 2048 kB hugepages reported on node 1 00:08:30.981 [2024-10-04 08:26:23.395183] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:30.981 [2024-10-04 08:26:23.430531] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:32.360 08:26:24 -- accel/accel.sh@18 -- # out='Preparing input file... 00:08:32.360 00:08:32.360 SPDK Configuration: 00:08:32.360 Core mask: 0x1 00:08:32.360 00:08:32.360 Accel Perf Configuration: 00:08:32.360 Workload Type: decompress 00:08:32.360 Transfer size: 111250 bytes 00:08:32.360 Vector count 1 00:08:32.360 Module: software 00:08:32.360 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:32.360 Queue depth: 32 00:08:32.360 Allocate depth: 32 00:08:32.360 # threads/core: 2 00:08:32.360 Run time: 1 seconds 00:08:32.360 Verify: Yes 00:08:32.360 00:08:32.360 Running for 1 seconds... 
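Note: the configuration banner above and the results table below are emitted by the accel_perf invocation traced at the start of this test. A minimal standalone sketch of the same run, assuming the workspace layout used throughout this log and omitting the -c /dev/fd/62 JSON config the harness injects:

    cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    # -t 1: run for 1 second; -w decompress: workload type; -y: verify output
    # -o 0: use the full input size (111250 bytes here, vs. the 4096-byte default run earlier)
    # -T 2: two threads per core; -l: compressed input file
    ./build/examples/accel_perf -t 1 -w decompress -l test/accel/bib -y -o 0 -T 2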
00:08:32.360 00:08:32.360 Core,Thread Transfers Bandwidth Failed Miscompares 00:08:32.360 ------------------------------------------------------------------------------------ 00:08:32.360 0,1 3008/s 124 MiB/s 0 0 00:08:32.360 0,0 2944/s 121 MiB/s 0 0 00:08:32.360 ==================================================================================== 00:08:32.360 Total 5952/s 631 MiB/s 0 0' 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # IFS=: 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # read -r var val 00:08:32.360 08:26:24 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:32.360 08:26:24 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:32.360 08:26:24 -- accel/accel.sh@12 -- # build_accel_config 00:08:32.360 08:26:24 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:32.360 08:26:24 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:32.360 08:26:24 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:32.360 08:26:24 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:32.360 08:26:24 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:32.360 08:26:24 -- accel/accel.sh@41 -- # local IFS=, 00:08:32.360 08:26:24 -- accel/accel.sh@42 -- # jq -r . 00:08:32.360 [2024-10-04 08:26:24.638656] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:08:32.360 [2024-10-04 08:26:24.638748] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1010782 ] 00:08:32.360 EAL: No free 2048 kB hugepages reported on node 1 00:08:32.360 [2024-10-04 08:26:24.707018] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:32.360 [2024-10-04 08:26:24.741245] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:32.360 08:26:24 -- accel/accel.sh@21 -- # val= 00:08:32.360 08:26:24 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # IFS=: 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # read -r var val 00:08:32.360 08:26:24 -- accel/accel.sh@21 -- # val= 00:08:32.360 08:26:24 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # IFS=: 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # read -r var val 00:08:32.360 08:26:24 -- accel/accel.sh@21 -- # val= 00:08:32.360 08:26:24 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # IFS=: 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # read -r var val 00:08:32.360 08:26:24 -- accel/accel.sh@21 -- # val=0x1 00:08:32.360 08:26:24 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # IFS=: 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # read -r var val 00:08:32.360 08:26:24 -- accel/accel.sh@21 -- # val= 00:08:32.360 08:26:24 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # IFS=: 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # read -r var val 00:08:32.360 08:26:24 -- accel/accel.sh@21 -- # val= 00:08:32.360 08:26:24 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # IFS=: 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # read -r var val 00:08:32.360 08:26:24 -- accel/accel.sh@21 -- # val=decompress 
00:08:32.360 08:26:24 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.360 08:26:24 -- accel/accel.sh@24 -- # accel_opc=decompress 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # IFS=: 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # read -r var val 00:08:32.360 08:26:24 -- accel/accel.sh@21 -- # val='111250 bytes' 00:08:32.360 08:26:24 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # IFS=: 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # read -r var val 00:08:32.360 08:26:24 -- accel/accel.sh@21 -- # val= 00:08:32.360 08:26:24 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # IFS=: 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # read -r var val 00:08:32.360 08:26:24 -- accel/accel.sh@21 -- # val=software 00:08:32.360 08:26:24 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.360 08:26:24 -- accel/accel.sh@23 -- # accel_module=software 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # IFS=: 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # read -r var val 00:08:32.360 08:26:24 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:32.360 08:26:24 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # IFS=: 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # read -r var val 00:08:32.360 08:26:24 -- accel/accel.sh@21 -- # val=32 00:08:32.360 08:26:24 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # IFS=: 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # read -r var val 00:08:32.360 08:26:24 -- accel/accel.sh@21 -- # val=32 00:08:32.360 08:26:24 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # IFS=: 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # read -r var val 00:08:32.360 08:26:24 -- accel/accel.sh@21 -- # val=2 00:08:32.360 08:26:24 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # IFS=: 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # read -r var val 00:08:32.360 08:26:24 -- accel/accel.sh@21 -- # val='1 seconds' 00:08:32.360 08:26:24 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # IFS=: 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # read -r var val 00:08:32.360 08:26:24 -- accel/accel.sh@21 -- # val=Yes 00:08:32.360 08:26:24 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # IFS=: 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # read -r var val 00:08:32.360 08:26:24 -- accel/accel.sh@21 -- # val= 00:08:32.360 08:26:24 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # IFS=: 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # read -r var val 00:08:32.360 08:26:24 -- accel/accel.sh@21 -- # val= 00:08:32.360 08:26:24 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # IFS=: 00:08:32.360 08:26:24 -- accel/accel.sh@20 -- # read -r var val 00:08:33.297 08:26:25 -- accel/accel.sh@21 -- # val= 00:08:33.297 08:26:25 -- accel/accel.sh@22 -- # case "$var" in 00:08:33.297 08:26:25 -- accel/accel.sh@20 -- # IFS=: 00:08:33.297 08:26:25 -- accel/accel.sh@20 -- # read -r var val 00:08:33.297 08:26:25 -- accel/accel.sh@21 -- # val= 00:08:33.297 08:26:25 -- accel/accel.sh@22 -- # case "$var" in 00:08:33.297 08:26:25 -- accel/accel.sh@20 -- # IFS=: 00:08:33.297 08:26:25 -- accel/accel.sh@20 -- # read -r var val 00:08:33.297 08:26:25 -- accel/accel.sh@21 -- # val= 00:08:33.297 08:26:25 -- 
accel/accel.sh@22 -- # case "$var" in 00:08:33.297 08:26:25 -- accel/accel.sh@20 -- # IFS=: 00:08:33.297 08:26:25 -- accel/accel.sh@20 -- # read -r var val 00:08:33.297 08:26:25 -- accel/accel.sh@21 -- # val= 00:08:33.297 08:26:25 -- accel/accel.sh@22 -- # case "$var" in 00:08:33.297 08:26:25 -- accel/accel.sh@20 -- # IFS=: 00:08:33.297 08:26:25 -- accel/accel.sh@20 -- # read -r var val 00:08:33.297 08:26:25 -- accel/accel.sh@21 -- # val= 00:08:33.297 08:26:25 -- accel/accel.sh@22 -- # case "$var" in 00:08:33.297 08:26:25 -- accel/accel.sh@20 -- # IFS=: 00:08:33.297 08:26:25 -- accel/accel.sh@20 -- # read -r var val 00:08:33.297 08:26:25 -- accel/accel.sh@21 -- # val= 00:08:33.298 08:26:25 -- accel/accel.sh@22 -- # case "$var" in 00:08:33.298 08:26:25 -- accel/accel.sh@20 -- # IFS=: 00:08:33.298 08:26:25 -- accel/accel.sh@20 -- # read -r var val 00:08:33.298 08:26:25 -- accel/accel.sh@21 -- # val= 00:08:33.298 08:26:25 -- accel/accel.sh@22 -- # case "$var" in 00:08:33.298 08:26:25 -- accel/accel.sh@20 -- # IFS=: 00:08:33.298 08:26:25 -- accel/accel.sh@20 -- # read -r var val 00:08:33.298 08:26:25 -- accel/accel.sh@28 -- # [[ -n software ]] 00:08:33.298 08:26:25 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:08:33.298 08:26:25 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:33.298 00:08:33.298 real 0m2.626s 00:08:33.298 user 0m2.380s 00:08:33.298 sys 0m0.254s 00:08:33.298 08:26:25 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:33.298 08:26:25 -- common/autotest_common.sh@10 -- # set +x 00:08:33.298 ************************************ 00:08:33.298 END TEST accel_deomp_full_mthread 00:08:33.298 ************************************ 00:08:33.298 08:26:25 -- accel/accel.sh@116 -- # [[ n == y ]] 00:08:33.298 08:26:25 -- accel/accel.sh@129 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:08:33.298 08:26:25 -- accel/accel.sh@129 -- # build_accel_config 00:08:33.298 08:26:25 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:08:33.298 08:26:25 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:33.298 08:26:25 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:33.298 08:26:25 -- common/autotest_common.sh@10 -- # set +x 00:08:33.298 08:26:25 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:33.298 08:26:25 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:33.298 08:26:25 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:33.298 08:26:25 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:33.298 08:26:25 -- accel/accel.sh@41 -- # local IFS=, 00:08:33.298 08:26:25 -- accel/accel.sh@42 -- # jq -r . 00:08:33.557 ************************************ 00:08:33.557 START TEST accel_dif_functional_tests 00:08:33.557 ************************************ 00:08:33.557 08:26:25 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:08:33.557 [2024-10-04 08:26:26.002088] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
00:08:33.557 [2024-10-04 08:26:26.002208] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1011064 ] 00:08:33.557 EAL: No free 2048 kB hugepages reported on node 1 00:08:33.557 [2024-10-04 08:26:26.070873] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:08:33.557 [2024-10-04 08:26:26.107819] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:08:33.557 [2024-10-04 08:26:26.107912] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:08:33.557 [2024-10-04 08:26:26.107914] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:33.557 00:08:33.557 00:08:33.557 CUnit - A unit testing framework for C - Version 2.1-3 00:08:33.557 http://cunit.sourceforge.net/ 00:08:33.557 00:08:33.557 00:08:33.557 Suite: accel_dif 00:08:33.557 Test: verify: DIF generated, GUARD check ...passed 00:08:33.557 Test: verify: DIF generated, APPTAG check ...passed 00:08:33.557 Test: verify: DIF generated, REFTAG check ...passed 00:08:33.557 Test: verify: DIF not generated, GUARD check ...[2024-10-04 08:26:26.171821] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:08:33.557 [2024-10-04 08:26:26.171876] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:08:33.557 passed 00:08:33.557 Test: verify: DIF not generated, APPTAG check ...[2024-10-04 08:26:26.171911] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:08:33.557 [2024-10-04 08:26:26.171930] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:08:33.557 passed 00:08:33.557 Test: verify: DIF not generated, REFTAG check ...[2024-10-04 08:26:26.171953] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:08:33.557 [2024-10-04 08:26:26.171973] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:08:33.557 passed 00:08:33.557 Test: verify: APPTAG correct, APPTAG check ...passed 00:08:33.557 Test: verify: APPTAG incorrect, APPTAG check ...[2024-10-04 08:26:26.172020] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:08:33.557 passed 00:08:33.557 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:08:33.557 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:08:33.557 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:08:33.557 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-10-04 08:26:26.172121] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:08:33.557 passed 00:08:33.557 Test: generate copy: DIF generated, GUARD check ...passed 00:08:33.557 Test: generate copy: DIF generated, APTTAG check ...passed 00:08:33.557 Test: generate copy: DIF generated, REFTAG check ...passed 00:08:33.557 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:08:33.557 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:08:33.557 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:08:33.557 Test: generate copy: iovecs-len validate ...[2024-10-04 08:26:26.172308] dif.c:1167:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:08:33.557 passed 00:08:33.557 Test: generate copy: buffer alignment validate ...passed 00:08:33.557 00:08:33.557 Run Summary: Type Total Ran Passed Failed Inactive 00:08:33.557 suites 1 1 n/a 0 0 00:08:33.557 tests 20 20 20 0 0 00:08:33.557 asserts 204 204 204 0 n/a 00:08:33.557 00:08:33.557 Elapsed time = 0.002 seconds 00:08:33.816 00:08:33.816 real 0m0.343s 00:08:33.816 user 0m0.535s 00:08:33.816 sys 0m0.152s 00:08:33.816 08:26:26 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:33.816 08:26:26 -- common/autotest_common.sh@10 -- # set +x 00:08:33.816 ************************************ 00:08:33.816 END TEST accel_dif_functional_tests 00:08:33.816 ************************************ 00:08:33.816 00:08:33.816 real 0m54.627s 00:08:33.816 user 1m2.497s 00:08:33.816 sys 0m6.674s 00:08:33.816 08:26:26 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:33.816 08:26:26 -- common/autotest_common.sh@10 -- # set +x 00:08:33.816 ************************************ 00:08:33.816 END TEST accel 00:08:33.816 ************************************ 00:08:33.816 08:26:26 -- spdk/autotest.sh@190 -- # run_test accel_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:08:33.816 08:26:26 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:08:33.816 08:26:26 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:33.816 08:26:26 -- common/autotest_common.sh@10 -- # set +x 00:08:33.816 ************************************ 00:08:33.816 START TEST accel_rpc 00:08:33.816 ************************************ 00:08:33.816 08:26:26 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:08:34.073 * Looking for test storage... 00:08:34.073 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:08:34.073 08:26:26 -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:34.073 08:26:26 -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=1011138 00:08:34.073 08:26:26 -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:08:34.073 08:26:26 -- accel/accel_rpc.sh@15 -- # waitforlisten 1011138 00:08:34.073 08:26:26 -- common/autotest_common.sh@819 -- # '[' -z 1011138 ']' 00:08:34.073 08:26:26 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:34.073 08:26:26 -- common/autotest_common.sh@824 -- # local max_retries=100 00:08:34.073 08:26:26 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:34.073 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:34.073 08:26:26 -- common/autotest_common.sh@828 -- # xtrace_disable 00:08:34.073 08:26:26 -- common/autotest_common.sh@10 -- # set +x 00:08:34.073 [2024-10-04 08:26:26.541290] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
00:08:34.073 [2024-10-04 08:26:26.541367] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1011138 ] 00:08:34.073 EAL: No free 2048 kB hugepages reported on node 1 00:08:34.073 [2024-10-04 08:26:26.607554] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:34.073 [2024-10-04 08:26:26.643838] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:34.073 [2024-10-04 08:26:26.643955] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:34.073 08:26:26 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:08:34.073 08:26:26 -- common/autotest_common.sh@852 -- # return 0 00:08:34.073 08:26:26 -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:08:34.073 08:26:26 -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:08:34.073 08:26:26 -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:08:34.073 08:26:26 -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:08:34.073 08:26:26 -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:08:34.073 08:26:26 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:08:34.073 08:26:26 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:34.073 08:26:26 -- common/autotest_common.sh@10 -- # set +x 00:08:34.073 ************************************ 00:08:34.073 START TEST accel_assign_opcode 00:08:34.073 ************************************ 00:08:34.073 08:26:26 -- common/autotest_common.sh@1104 -- # accel_assign_opcode_test_suite 00:08:34.073 08:26:26 -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:08:34.074 08:26:26 -- common/autotest_common.sh@551 -- # xtrace_disable 00:08:34.074 08:26:26 -- common/autotest_common.sh@10 -- # set +x 00:08:34.074 [2024-10-04 08:26:26.724478] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:08:34.074 08:26:26 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:08:34.074 08:26:26 -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:08:34.074 08:26:26 -- common/autotest_common.sh@551 -- # xtrace_disable 00:08:34.074 08:26:26 -- common/autotest_common.sh@10 -- # set +x 00:08:34.074 [2024-10-04 08:26:26.732492] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:08:34.074 08:26:26 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:08:34.074 08:26:26 -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:08:34.074 08:26:26 -- common/autotest_common.sh@551 -- # xtrace_disable 00:08:34.074 08:26:26 -- common/autotest_common.sh@10 -- # set +x 00:08:34.332 08:26:26 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:08:34.332 08:26:26 -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:08:34.332 08:26:26 -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:08:34.332 08:26:26 -- accel/accel_rpc.sh@42 -- # grep software 00:08:34.332 08:26:26 -- common/autotest_common.sh@551 -- # xtrace_disable 00:08:34.332 08:26:26 -- common/autotest_common.sh@10 -- # set +x 00:08:34.332 08:26:26 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:08:34.332 software 00:08:34.332 00:08:34.332 real 0m0.218s 00:08:34.332 user 0m0.045s 00:08:34.332 sys 0m0.011s 00:08:34.332 08:26:26 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:34.332 08:26:26 -- common/autotest_common.sh@10 -- # set +x 
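Note: the assign-opcode trace above is driven entirely over JSON-RPC while the target sits in --wait-for-rpc mode. A condensed sketch of the same sequence, assuming the workspace paths shown in this log (rpc_cmd in the harness effectively wraps scripts/rpc.py):

    build/bin/spdk_tgt --wait-for-rpc &                       # start target, defer subsystem init
    scripts/rpc.py accel_assign_opc -o copy -m incorrect      # pre-init: even a bogus module name is recorded
    scripts/rpc.py accel_assign_opc -o copy -m software       # re-assign 'copy' to the software module
    scripts/rpc.py framework_start_init                       # complete subsystem initialization
    scripts/rpc.py accel_get_opc_assignments | jq -r .copy    # prints: software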
00:08:34.332 ************************************ 00:08:34.332 END TEST accel_assign_opcode 00:08:34.332 ************************************ 00:08:34.332 08:26:26 -- accel/accel_rpc.sh@55 -- # killprocess 1011138 00:08:34.332 08:26:26 -- common/autotest_common.sh@926 -- # '[' -z 1011138 ']' 00:08:34.332 08:26:26 -- common/autotest_common.sh@930 -- # kill -0 1011138 00:08:34.332 08:26:26 -- common/autotest_common.sh@931 -- # uname 00:08:34.332 08:26:26 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:08:34.332 08:26:26 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1011138 00:08:34.591 08:26:27 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:08:34.591 08:26:27 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:08:34.591 08:26:27 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1011138' 00:08:34.591 killing process with pid 1011138 00:08:34.591 08:26:27 -- common/autotest_common.sh@945 -- # kill 1011138 00:08:34.591 08:26:27 -- common/autotest_common.sh@950 -- # wait 1011138 00:08:34.850 00:08:34.850 real 0m0.906s 00:08:34.850 user 0m0.833s 00:08:34.850 sys 0m0.425s 00:08:34.850 08:26:27 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:34.850 08:26:27 -- common/autotest_common.sh@10 -- # set +x 00:08:34.850 ************************************ 00:08:34.850 END TEST accel_rpc 00:08:34.850 ************************************ 00:08:34.850 08:26:27 -- spdk/autotest.sh@191 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:08:34.850 08:26:27 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:08:34.850 08:26:27 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:34.850 08:26:27 -- common/autotest_common.sh@10 -- # set +x 00:08:34.850 ************************************ 00:08:34.850 START TEST app_cmdline 00:08:34.850 ************************************ 00:08:34.851 08:26:27 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:08:34.851 * Looking for test storage... 00:08:34.851 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:34.851 08:26:27 -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:08:34.851 08:26:27 -- app/cmdline.sh@17 -- # spdk_tgt_pid=1011460 00:08:34.851 08:26:27 -- app/cmdline.sh@18 -- # waitforlisten 1011460 00:08:34.851 08:26:27 -- common/autotest_common.sh@819 -- # '[' -z 1011460 ']' 00:08:34.851 08:26:27 -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:08:34.851 08:26:27 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:34.851 08:26:27 -- common/autotest_common.sh@824 -- # local max_retries=100 00:08:34.851 08:26:27 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:34.851 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:34.851 08:26:27 -- common/autotest_common.sh@828 -- # xtrace_disable 00:08:34.851 08:26:27 -- common/autotest_common.sh@10 -- # set +x 00:08:34.851 [2024-10-04 08:26:27.485784] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
00:08:34.851 [2024-10-04 08:26:27.485853] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1011460 ] 00:08:34.851 EAL: No free 2048 kB hugepages reported on node 1 00:08:35.110 [2024-10-04 08:26:27.549608] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:35.110 [2024-10-04 08:26:27.587789] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:35.110 [2024-10-04 08:26:27.587899] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:35.677 08:26:28 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:08:35.677 08:26:28 -- common/autotest_common.sh@852 -- # return 0 00:08:35.677 08:26:28 -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:08:35.936 { 00:08:35.936 "version": "SPDK v24.01.1-pre git sha1 726a04d70", 00:08:35.936 "fields": { 00:08:35.936 "major": 24, 00:08:35.936 "minor": 1, 00:08:35.936 "patch": 1, 00:08:35.936 "suffix": "-pre", 00:08:35.936 "commit": "726a04d70" 00:08:35.936 } 00:08:35.936 } 00:08:35.936 08:26:28 -- app/cmdline.sh@22 -- # expected_methods=() 00:08:35.936 08:26:28 -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:08:35.936 08:26:28 -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:08:35.936 08:26:28 -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:08:35.936 08:26:28 -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:08:35.936 08:26:28 -- common/autotest_common.sh@551 -- # xtrace_disable 00:08:35.936 08:26:28 -- common/autotest_common.sh@10 -- # set +x 00:08:35.936 08:26:28 -- app/cmdline.sh@26 -- # jq -r '.[]' 00:08:35.936 08:26:28 -- app/cmdline.sh@26 -- # sort 00:08:35.936 08:26:28 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:08:35.936 08:26:28 -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:08:35.936 08:26:28 -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:08:35.936 08:26:28 -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:35.936 08:26:28 -- common/autotest_common.sh@640 -- # local es=0 00:08:35.936 08:26:28 -- common/autotest_common.sh@642 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:35.936 08:26:28 -- common/autotest_common.sh@628 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:08:35.936 08:26:28 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:08:35.936 08:26:28 -- common/autotest_common.sh@632 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:08:35.936 08:26:28 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:08:35.936 08:26:28 -- common/autotest_common.sh@634 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:08:35.936 08:26:28 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:08:35.936 08:26:28 -- common/autotest_common.sh@634 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:08:35.936 08:26:28 -- common/autotest_common.sh@634 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:08:35.936 08:26:28 -- 
common/autotest_common.sh@643 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:36.195 request: 00:08:36.195 { 00:08:36.195 "method": "env_dpdk_get_mem_stats", 00:08:36.195 "req_id": 1 00:08:36.195 } 00:08:36.195 Got JSON-RPC error response 00:08:36.195 response: 00:08:36.195 { 00:08:36.195 "code": -32601, 00:08:36.195 "message": "Method not found" 00:08:36.195 } 00:08:36.195 08:26:28 -- common/autotest_common.sh@643 -- # es=1 00:08:36.195 08:26:28 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:08:36.195 08:26:28 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:08:36.195 08:26:28 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:08:36.195 08:26:28 -- app/cmdline.sh@1 -- # killprocess 1011460 00:08:36.195 08:26:28 -- common/autotest_common.sh@926 -- # '[' -z 1011460 ']' 00:08:36.195 08:26:28 -- common/autotest_common.sh@930 -- # kill -0 1011460 00:08:36.195 08:26:28 -- common/autotest_common.sh@931 -- # uname 00:08:36.195 08:26:28 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:08:36.195 08:26:28 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1011460 00:08:36.195 08:26:28 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:08:36.195 08:26:28 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:08:36.195 08:26:28 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1011460' 00:08:36.195 killing process with pid 1011460 00:08:36.195 08:26:28 -- common/autotest_common.sh@945 -- # kill 1011460 00:08:36.196 08:26:28 -- common/autotest_common.sh@950 -- # wait 1011460 00:08:36.455 00:08:36.455 real 0m1.662s 00:08:36.455 user 0m1.962s 00:08:36.455 sys 0m0.453s 00:08:36.455 08:26:29 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:36.455 08:26:29 -- common/autotest_common.sh@10 -- # set +x 00:08:36.455 ************************************ 00:08:36.455 END TEST app_cmdline 00:08:36.455 ************************************ 00:08:36.455 08:26:29 -- spdk/autotest.sh@192 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:08:36.455 08:26:29 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:08:36.455 08:26:29 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:36.455 08:26:29 -- common/autotest_common.sh@10 -- # set +x 00:08:36.455 ************************************ 00:08:36.455 START TEST version 00:08:36.455 ************************************ 00:08:36.455 08:26:29 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:08:36.715 * Looking for test storage... 
00:08:36.715 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:36.715 08:26:29 -- app/version.sh@17 -- # get_header_version major 00:08:36.715 08:26:29 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:08:36.715 08:26:29 -- app/version.sh@14 -- # cut -f2 00:08:36.715 08:26:29 -- app/version.sh@14 -- # tr -d '"' 00:08:36.715 08:26:29 -- app/version.sh@17 -- # major=24 00:08:36.715 08:26:29 -- app/version.sh@18 -- # get_header_version minor 00:08:36.715 08:26:29 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:08:36.715 08:26:29 -- app/version.sh@14 -- # cut -f2 00:08:36.715 08:26:29 -- app/version.sh@14 -- # tr -d '"' 00:08:36.715 08:26:29 -- app/version.sh@18 -- # minor=1 00:08:36.715 08:26:29 -- app/version.sh@19 -- # get_header_version patch 00:08:36.715 08:26:29 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:08:36.715 08:26:29 -- app/version.sh@14 -- # cut -f2 00:08:36.715 08:26:29 -- app/version.sh@14 -- # tr -d '"' 00:08:36.715 08:26:29 -- app/version.sh@19 -- # patch=1 00:08:36.715 08:26:29 -- app/version.sh@20 -- # get_header_version suffix 00:08:36.715 08:26:29 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:08:36.715 08:26:29 -- app/version.sh@14 -- # cut -f2 00:08:36.715 08:26:29 -- app/version.sh@14 -- # tr -d '"' 00:08:36.715 08:26:29 -- app/version.sh@20 -- # suffix=-pre 00:08:36.715 08:26:29 -- app/version.sh@22 -- # version=24.1 00:08:36.715 08:26:29 -- app/version.sh@25 -- # (( patch != 0 )) 00:08:36.715 08:26:29 -- app/version.sh@25 -- # version=24.1.1 00:08:36.715 08:26:29 -- app/version.sh@28 -- # version=24.1.1rc0 00:08:36.715 08:26:29 -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:36.715 08:26:29 -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:08:36.715 08:26:29 -- app/version.sh@30 -- # py_version=24.1.1rc0 00:08:36.715 08:26:29 -- app/version.sh@31 -- # [[ 24.1.1rc0 == \2\4\.\1\.\1\r\c\0 ]] 00:08:36.715 00:08:36.715 real 0m0.170s 00:08:36.715 user 0m0.077s 00:08:36.715 sys 0m0.139s 00:08:36.715 08:26:29 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:36.715 08:26:29 -- common/autotest_common.sh@10 -- # set +x 00:08:36.715 ************************************ 00:08:36.715 END TEST version 00:08:36.715 ************************************ 00:08:36.715 08:26:29 -- spdk/autotest.sh@194 -- # '[' 0 -eq 1 ']' 00:08:36.715 08:26:29 -- spdk/autotest.sh@204 -- # uname -s 00:08:36.715 08:26:29 -- spdk/autotest.sh@204 -- # [[ Linux == Linux ]] 00:08:36.715 08:26:29 -- spdk/autotest.sh@205 -- # [[ 0 -eq 1 ]] 00:08:36.715 08:26:29 -- spdk/autotest.sh@205 -- # [[ 0 -eq 1 ]] 00:08:36.715 08:26:29 -- spdk/autotest.sh@217 -- # '[' 0 -eq 1 ']' 00:08:36.715 08:26:29 -- spdk/autotest.sh@264 -- # '[' 0 -eq 1 ']' 00:08:36.715 08:26:29 -- spdk/autotest.sh@268 -- # timing_exit lib 
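Note: the version test that just finished scrapes include/spdk/version.h and checks the result against the installed Python package. A compacted sketch of that extraction; the ver helper is illustrative, the grep/cut/tr pipeline is exactly what the trace shows, and the -pre to rc0 mapping follows the values in the trace:

    cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    ver() { grep -E "^#define SPDK_VERSION_${1}[[:space:]]+" include/spdk/version.h | cut -f2 | tr -d '"'; }
    version="$(ver MAJOR).$(ver MINOR)"                           # 24.1
    (( $(ver PATCH) != 0 )) && version="$version.$(ver PATCH)"    # 24.1.1
    [[ $(ver SUFFIX) == -pre ]] && version="${version}rc0"        # 24.1.1rc0
    [[ $(python3 -c 'import spdk; print(spdk.__version__)') == "$version" ]]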
00:08:36.715 08:26:29 -- common/autotest_common.sh@718 -- # xtrace_disable 00:08:36.715 08:26:29 -- common/autotest_common.sh@10 -- # set +x 00:08:36.715 08:26:29 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:08:36.715 08:26:29 -- spdk/autotest.sh@278 -- # '[' 0 -eq 1 ']' 00:08:36.715 08:26:29 -- spdk/autotest.sh@287 -- # '[' 0 -eq 1 ']' 00:08:36.715 08:26:29 -- spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:08:36.715 08:26:29 -- spdk/autotest.sh@315 -- # '[' 0 -eq 1 ']' 00:08:36.715 08:26:29 -- spdk/autotest.sh@319 -- # '[' 0 -eq 1 ']' 00:08:36.715 08:26:29 -- spdk/autotest.sh@324 -- # '[' 0 -eq 1 ']' 00:08:36.715 08:26:29 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:08:36.715 08:26:29 -- spdk/autotest.sh@338 -- # '[' 0 -eq 1 ']' 00:08:36.715 08:26:29 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:08:36.715 08:26:29 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:08:36.715 08:26:29 -- spdk/autotest.sh@350 -- # '[' 0 -eq 1 ']' 00:08:36.715 08:26:29 -- spdk/autotest.sh@355 -- # '[' 0 -eq 1 ']' 00:08:36.715 08:26:29 -- spdk/autotest.sh@359 -- # '[' 0 -eq 1 ']' 00:08:36.715 08:26:29 -- spdk/autotest.sh@366 -- # [[ 0 -eq 1 ]] 00:08:36.715 08:26:29 -- spdk/autotest.sh@370 -- # [[ 0 -eq 1 ]] 00:08:36.715 08:26:29 -- spdk/autotest.sh@374 -- # [[ 1 -eq 1 ]] 00:08:36.715 08:26:29 -- spdk/autotest.sh@375 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:08:36.715 08:26:29 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:08:36.715 08:26:29 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:36.715 08:26:29 -- common/autotest_common.sh@10 -- # set +x 00:08:36.715 ************************************ 00:08:36.715 START TEST llvm_fuzz 00:08:36.715 ************************************ 00:08:36.715 08:26:29 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:08:36.976 * Looking for test storage... 
00:08:36.976 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:08:36.976 08:26:29 -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:08:36.976 08:26:29 -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:08:36.976 08:26:29 -- common/autotest_common.sh@538 -- # fuzzers=() 00:08:36.976 08:26:29 -- common/autotest_common.sh@538 -- # local fuzzers 00:08:36.976 08:26:29 -- common/autotest_common.sh@540 -- # [[ -n '' ]] 00:08:36.976 08:26:29 -- common/autotest_common.sh@543 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:08:36.976 08:26:29 -- common/autotest_common.sh@544 -- # fuzzers=("${fuzzers[@]##*/}") 00:08:36.976 08:26:29 -- common/autotest_common.sh@547 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:08:36.976 08:26:29 -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:08:36.976 08:26:29 -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/coverage 00:08:36.976 08:26:29 -- fuzz/llvm.sh@56 -- # [[ 1 -eq 0 ]] 00:08:36.976 08:26:29 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:08:36.976 08:26:29 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:08:36.976 08:26:29 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:08:36.976 08:26:29 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:08:36.976 08:26:29 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:08:36.976 08:26:29 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:08:36.976 08:26:29 -- fuzz/llvm.sh@62 -- # run_test nvmf_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:08:36.976 08:26:29 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:08:36.976 08:26:29 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:36.976 08:26:29 -- common/autotest_common.sh@10 -- # set +x 00:08:36.976 ************************************ 00:08:36.976 START TEST nvmf_fuzz 00:08:36.976 ************************************ 00:08:36.976 08:26:29 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:08:36.976 * Looking for test storage... 
00:08:36.976 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:08:36.976 08:26:29 -- nvmf/run.sh@52 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:08:36.976 08:26:29 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:08:36.976 08:26:29 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:08:36.976 08:26:29 -- common/autotest_common.sh@34 -- # set -e 00:08:36.976 08:26:29 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:08:36.976 08:26:29 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:08:36.976 08:26:29 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:08:36.976 08:26:29 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:08:36.976 08:26:29 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:08:36.976 08:26:29 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:08:36.976 08:26:29 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:08:36.976 08:26:29 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:08:36.976 08:26:29 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:08:36.976 08:26:29 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:08:36.976 08:26:29 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:08:36.976 08:26:29 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:08:36.976 08:26:29 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:08:36.976 08:26:29 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:08:36.976 08:26:29 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:08:36.976 08:26:29 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:08:36.976 08:26:29 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:08:36.976 08:26:29 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:08:36.976 08:26:29 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:08:36.976 08:26:29 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:08:36.976 08:26:29 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:08:36.976 08:26:29 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:08:36.977 08:26:29 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:36.977 08:26:29 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:08:36.977 08:26:29 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:08:36.977 08:26:29 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:08:36.977 08:26:29 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:08:36.977 08:26:29 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:08:36.977 08:26:29 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:08:36.977 08:26:29 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:08:36.977 08:26:29 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:08:36.977 08:26:29 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:08:36.977 08:26:29 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:08:36.977 08:26:29 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:08:36.977 08:26:29 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:08:36.977 08:26:29 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:08:36.977 08:26:29 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:08:36.977 08:26:29 -- common/build_config.sh@34 -- # 
CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:36.977 08:26:29 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:08:36.977 08:26:29 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:36.977 08:26:29 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:08:36.977 08:26:29 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:08:36.977 08:26:29 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:08:36.977 08:26:29 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:08:36.977 08:26:29 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:36.977 08:26:29 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:08:36.977 08:26:29 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:08:36.977 08:26:29 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:08:36.977 08:26:29 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:08:36.977 08:26:29 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:08:36.977 08:26:29 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:08:36.977 08:26:29 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:08:36.977 08:26:29 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:08:36.977 08:26:29 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:08:36.977 08:26:29 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:08:36.977 08:26:29 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:08:36.977 08:26:29 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:08:36.977 08:26:29 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:08:36.977 08:26:29 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:08:36.977 08:26:29 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:08:36.977 08:26:29 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:08:36.977 08:26:29 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:08:36.977 08:26:29 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:08:36.977 08:26:29 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:08:36.977 08:26:29 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:36.977 08:26:29 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:08:36.977 08:26:29 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:08:36.977 08:26:29 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:08:36.977 08:26:29 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:08:36.977 08:26:29 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:08:36.977 08:26:29 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:08:36.977 08:26:29 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:08:36.977 08:26:29 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:08:36.977 08:26:29 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:08:36.977 08:26:29 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:08:36.977 08:26:29 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:08:36.977 08:26:29 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:08:36.977 08:26:29 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:08:36.977 08:26:29 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:08:36.977 08:26:29 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:08:36.977 08:26:29 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:08:36.977 08:26:29 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 
00:08:36.977 08:26:29 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:08:36.977 08:26:29 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:36.977 08:26:29 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:36.977 08:26:29 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:36.977 08:26:29 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:36.977 08:26:29 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:36.977 08:26:29 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:36.977 08:26:29 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:36.977 08:26:29 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:36.977 08:26:29 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:08:36.977 08:26:29 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:08:36.977 08:26:29 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:08:36.977 08:26:29 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:08:36.977 08:26:29 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:08:36.977 08:26:29 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:08:36.977 08:26:29 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:08:36.977 08:26:29 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:08:36.977 #define SPDK_CONFIG_H 00:08:36.977 #define SPDK_CONFIG_APPS 1 00:08:36.977 #define SPDK_CONFIG_ARCH native 00:08:36.977 #undef SPDK_CONFIG_ASAN 00:08:36.977 #undef SPDK_CONFIG_AVAHI 00:08:36.977 #undef SPDK_CONFIG_CET 00:08:36.977 #define SPDK_CONFIG_COVERAGE 1 00:08:36.977 #define SPDK_CONFIG_CROSS_PREFIX 00:08:36.977 #undef SPDK_CONFIG_CRYPTO 00:08:36.977 #undef SPDK_CONFIG_CRYPTO_MLX5 00:08:36.977 #undef SPDK_CONFIG_CUSTOMOCF 00:08:36.977 #undef SPDK_CONFIG_DAOS 00:08:36.977 #define SPDK_CONFIG_DAOS_DIR 00:08:36.977 #define SPDK_CONFIG_DEBUG 1 00:08:36.977 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:08:36.977 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:36.977 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:36.977 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:36.977 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:08:36.977 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:36.977 #define SPDK_CONFIG_EXAMPLES 1 00:08:36.977 #undef SPDK_CONFIG_FC 00:08:36.977 #define SPDK_CONFIG_FC_PATH 00:08:36.977 #define SPDK_CONFIG_FIO_PLUGIN 1 00:08:36.977 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:08:36.977 #undef SPDK_CONFIG_FUSE 00:08:36.977 #define SPDK_CONFIG_FUZZER 1 00:08:36.977 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:36.977 #undef SPDK_CONFIG_GOLANG 00:08:36.977 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:08:36.977 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 
00:08:36.977 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:08:36.977 #undef SPDK_CONFIG_HAVE_LIBBSD 00:08:36.977 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:08:36.977 #define SPDK_CONFIG_IDXD 1 00:08:36.977 #define SPDK_CONFIG_IDXD_KERNEL 1 00:08:36.977 #undef SPDK_CONFIG_IPSEC_MB 00:08:36.977 #define SPDK_CONFIG_IPSEC_MB_DIR 00:08:36.977 #define SPDK_CONFIG_ISAL 1 00:08:36.977 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:08:36.977 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:08:36.977 #define SPDK_CONFIG_LIBDIR 00:08:36.977 #undef SPDK_CONFIG_LTO 00:08:36.977 #define SPDK_CONFIG_MAX_LCORES 00:08:36.977 #define SPDK_CONFIG_NVME_CUSE 1 00:08:36.977 #undef SPDK_CONFIG_OCF 00:08:36.977 #define SPDK_CONFIG_OCF_PATH 00:08:36.977 #define SPDK_CONFIG_OPENSSL_PATH 00:08:36.977 #undef SPDK_CONFIG_PGO_CAPTURE 00:08:36.977 #undef SPDK_CONFIG_PGO_USE 00:08:36.977 #define SPDK_CONFIG_PREFIX /usr/local 00:08:36.977 #undef SPDK_CONFIG_RAID5F 00:08:36.977 #undef SPDK_CONFIG_RBD 00:08:36.977 #define SPDK_CONFIG_RDMA 1 00:08:36.977 #define SPDK_CONFIG_RDMA_PROV verbs 00:08:36.977 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:08:36.977 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:08:36.977 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:08:36.977 #undef SPDK_CONFIG_SHARED 00:08:36.977 #undef SPDK_CONFIG_SMA 00:08:36.977 #define SPDK_CONFIG_TESTS 1 00:08:36.977 #undef SPDK_CONFIG_TSAN 00:08:36.977 #define SPDK_CONFIG_UBLK 1 00:08:36.977 #define SPDK_CONFIG_UBSAN 1 00:08:36.977 #undef SPDK_CONFIG_UNIT_TESTS 00:08:36.977 #undef SPDK_CONFIG_URING 00:08:36.977 #define SPDK_CONFIG_URING_PATH 00:08:36.977 #undef SPDK_CONFIG_URING_ZNS 00:08:36.977 #undef SPDK_CONFIG_USDT 00:08:36.977 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:08:36.977 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:08:36.977 #define SPDK_CONFIG_VFIO_USER 1 00:08:36.977 #define SPDK_CONFIG_VFIO_USER_DIR 00:08:36.977 #define SPDK_CONFIG_VHOST 1 00:08:36.977 #define SPDK_CONFIG_VIRTIO 1 00:08:36.977 #undef SPDK_CONFIG_VTUNE 00:08:36.977 #define SPDK_CONFIG_VTUNE_DIR 00:08:36.977 #define SPDK_CONFIG_WERROR 1 00:08:36.977 #define SPDK_CONFIG_WPDK_DIR 00:08:36.977 #undef SPDK_CONFIG_XNVME 00:08:36.977 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:08:36.977 08:26:29 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:08:36.977 08:26:29 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:36.977 08:26:29 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:36.977 08:26:29 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:36.977 08:26:29 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:36.977 08:26:29 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:36.978 08:26:29 -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:36.978 08:26:29 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:36.978 08:26:29 -- paths/export.sh@5 -- # export PATH 00:08:36.978 08:26:29 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:36.978 08:26:29 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:36.978 08:26:29 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:36.978 08:26:29 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:36.978 08:26:29 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:36.978 08:26:29 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:08:36.978 08:26:29 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:36.978 08:26:29 -- pm/common@16 -- # TEST_TAG=N/A 00:08:36.978 08:26:29 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:08:36.978 08:26:29 -- common/autotest_common.sh@52 -- # : 1 00:08:36.978 08:26:29 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:08:36.978 08:26:29 -- common/autotest_common.sh@56 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:08:36.978 08:26:29 -- common/autotest_common.sh@58 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:08:36.978 08:26:29 -- common/autotest_common.sh@60 -- # : 1 00:08:36.978 08:26:29 -- common/autotest_common.sh@61 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:08:36.978 08:26:29 -- common/autotest_common.sh@62 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:08:36.978 08:26:29 -- common/autotest_common.sh@64 -- # : 00:08:36.978 08:26:29 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:08:36.978 08:26:29 -- common/autotest_common.sh@66 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:08:36.978 08:26:29 
-- common/autotest_common.sh@68 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:08:36.978 08:26:29 -- common/autotest_common.sh@70 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:08:36.978 08:26:29 -- common/autotest_common.sh@72 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:08:36.978 08:26:29 -- common/autotest_common.sh@74 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:08:36.978 08:26:29 -- common/autotest_common.sh@76 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:08:36.978 08:26:29 -- common/autotest_common.sh@78 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:08:36.978 08:26:29 -- common/autotest_common.sh@80 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:08:36.978 08:26:29 -- common/autotest_common.sh@82 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:08:36.978 08:26:29 -- common/autotest_common.sh@84 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:08:36.978 08:26:29 -- common/autotest_common.sh@86 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:08:36.978 08:26:29 -- common/autotest_common.sh@88 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:08:36.978 08:26:29 -- common/autotest_common.sh@90 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:08:36.978 08:26:29 -- common/autotest_common.sh@92 -- # : 1 00:08:36.978 08:26:29 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:08:36.978 08:26:29 -- common/autotest_common.sh@94 -- # : 1 00:08:36.978 08:26:29 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:08:36.978 08:26:29 -- common/autotest_common.sh@96 -- # : rdma 00:08:36.978 08:26:29 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:08:36.978 08:26:29 -- common/autotest_common.sh@98 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:08:36.978 08:26:29 -- common/autotest_common.sh@100 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:08:36.978 08:26:29 -- common/autotest_common.sh@102 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:08:36.978 08:26:29 -- common/autotest_common.sh@104 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:08:36.978 08:26:29 -- common/autotest_common.sh@106 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:08:36.978 08:26:29 -- common/autotest_common.sh@108 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@109 -- # export SPDK_TEST_VHOST_INIT 00:08:36.978 08:26:29 -- common/autotest_common.sh@110 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:08:36.978 08:26:29 -- common/autotest_common.sh@112 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:08:36.978 08:26:29 -- common/autotest_common.sh@114 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:08:36.978 
08:26:29 -- common/autotest_common.sh@116 -- # : 1 00:08:36.978 08:26:29 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:08:36.978 08:26:29 -- common/autotest_common.sh@118 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:36.978 08:26:29 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:08:36.978 08:26:29 -- common/autotest_common.sh@120 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:08:36.978 08:26:29 -- common/autotest_common.sh@122 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:08:36.978 08:26:29 -- common/autotest_common.sh@124 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:08:36.978 08:26:29 -- common/autotest_common.sh@126 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:08:36.978 08:26:29 -- common/autotest_common.sh@128 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:08:36.978 08:26:29 -- common/autotest_common.sh@130 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:08:36.978 08:26:29 -- common/autotest_common.sh@132 -- # : v22.11.4 00:08:36.978 08:26:29 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:08:36.978 08:26:29 -- common/autotest_common.sh@134 -- # : true 00:08:36.978 08:26:29 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:08:36.978 08:26:29 -- common/autotest_common.sh@136 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:08:36.978 08:26:29 -- common/autotest_common.sh@138 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:08:36.978 08:26:29 -- common/autotest_common.sh@140 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:08:36.978 08:26:29 -- common/autotest_common.sh@142 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:08:36.978 08:26:29 -- common/autotest_common.sh@144 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:08:36.978 08:26:29 -- common/autotest_common.sh@146 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:08:36.978 08:26:29 -- common/autotest_common.sh@148 -- # : 00:08:36.978 08:26:29 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:08:36.978 08:26:29 -- common/autotest_common.sh@150 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:08:36.978 08:26:29 -- common/autotest_common.sh@152 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:08:36.978 08:26:29 -- common/autotest_common.sh@154 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:08:36.978 08:26:29 -- common/autotest_common.sh@156 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:08:36.978 08:26:29 -- common/autotest_common.sh@158 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:08:36.978 08:26:29 -- common/autotest_common.sh@160 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:08:36.978 08:26:29 -- common/autotest_common.sh@163 -- # : 00:08:36.978 08:26:29 -- 
common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:08:36.978 08:26:29 -- common/autotest_common.sh@165 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:08:36.978 08:26:29 -- common/autotest_common.sh@167 -- # : 0 00:08:36.978 08:26:29 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:08:36.978 08:26:29 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:36.978 08:26:29 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:36.978 08:26:29 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:36.978 08:26:29 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:36.978 08:26:29 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:36.978 08:26:29 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:36.979 08:26:29 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:36.979 08:26:29 -- common/autotest_common.sh@174 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:36.979 08:26:29 -- common/autotest_common.sh@177 
-- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:08:36.979 08:26:29 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:08:36.979 08:26:29 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:36.979 08:26:29 -- common/autotest_common.sh@181 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:36.979 08:26:29 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:08:36.979 08:26:29 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:08:36.979 08:26:29 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:36.979 08:26:29 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:36.979 08:26:29 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:36.979 08:26:29 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:36.979 08:26:29 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:08:36.979 08:26:29 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:08:36.979 08:26:29 -- common/autotest_common.sh@196 -- # cat 00:08:36.979 08:26:29 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:08:36.979 08:26:29 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:36.979 08:26:29 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:36.979 08:26:29 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:36.979 08:26:29 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:36.979 08:26:29 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:08:36.979 08:26:29 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:08:36.979 08:26:29 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:36.979 08:26:29 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:36.979 08:26:29 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:36.979 08:26:29 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:36.979 08:26:29 -- 
common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:36.979 08:26:29 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:36.979 08:26:29 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:36.979 08:26:29 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:36.979 08:26:29 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:36.979 08:26:29 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:36.979 08:26:29 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:36.979 08:26:29 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:36.979 08:26:29 -- common/autotest_common.sh@248 -- # '[' 0 -eq 0 ']' 00:08:36.979 08:26:29 -- common/autotest_common.sh@249 -- # export valgrind= 00:08:36.979 08:26:29 -- common/autotest_common.sh@249 -- # valgrind= 00:08:36.979 08:26:29 -- common/autotest_common.sh@255 -- # uname -s 00:08:36.979 08:26:29 -- common/autotest_common.sh@255 -- # '[' Linux = Linux ']' 00:08:36.979 08:26:29 -- common/autotest_common.sh@256 -- # HUGEMEM=4096 00:08:36.979 08:26:29 -- common/autotest_common.sh@257 -- # export CLEAR_HUGE=yes 00:08:36.979 08:26:29 -- common/autotest_common.sh@257 -- # CLEAR_HUGE=yes 00:08:36.979 08:26:29 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:08:36.979 08:26:29 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:08:36.979 08:26:29 -- common/autotest_common.sh@265 -- # MAKE=make 00:08:36.979 08:26:29 -- common/autotest_common.sh@266 -- # MAKEFLAGS=-j112 00:08:36.979 08:26:29 -- common/autotest_common.sh@282 -- # export HUGEMEM=4096 00:08:36.979 08:26:29 -- common/autotest_common.sh@282 -- # HUGEMEM=4096 00:08:36.979 08:26:29 -- common/autotest_common.sh@284 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:08:36.979 08:26:29 -- common/autotest_common.sh@289 -- # NO_HUGE=() 00:08:36.979 08:26:29 -- common/autotest_common.sh@290 -- # TEST_MODE= 00:08:36.979 08:26:29 -- common/autotest_common.sh@309 -- # [[ -z 1011885 ]] 00:08:36.979 08:26:29 -- common/autotest_common.sh@309 -- # kill -0 1011885 00:08:36.979 08:26:29 -- common/autotest_common.sh@1665 -- # set_test_storage 2147483648 00:08:36.979 08:26:29 -- common/autotest_common.sh@319 -- # [[ -v testdir ]] 00:08:36.979 08:26:29 -- common/autotest_common.sh@321 -- # local requested_size=2147483648 00:08:36.979 08:26:29 -- common/autotest_common.sh@322 -- # local mount target_dir 00:08:36.979 08:26:29 -- common/autotest_common.sh@324 -- # local -A mounts fss sizes avails uses 00:08:36.979 08:26:29 -- common/autotest_common.sh@325 -- # local source fs size avail mount use 00:08:36.979 08:26:29 -- common/autotest_common.sh@327 -- # local storage_fallback storage_candidates 00:08:36.979 08:26:29 -- common/autotest_common.sh@329 -- # mktemp -udt spdk.XXXXXX 00:08:37.239 08:26:29 -- common/autotest_common.sh@329 -- # storage_fallback=/tmp/spdk.ecPko4 00:08:37.239 08:26:29 -- common/autotest_common.sh@334 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:08:37.239 08:26:29 -- common/autotest_common.sh@336 -- # [[ -n '' ]] 00:08:37.239 08:26:29 -- common/autotest_common.sh@341 -- # 
[[ -n '' ]] 00:08:37.239 08:26:29 -- common/autotest_common.sh@346 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.ecPko4/tests/nvmf /tmp/spdk.ecPko4 00:08:37.239 08:26:29 -- common/autotest_common.sh@349 -- # requested_size=2214592512 00:08:37.239 08:26:29 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:37.239 08:26:29 -- common/autotest_common.sh@318 -- # grep -v Filesystem 00:08:37.239 08:26:29 -- common/autotest_common.sh@318 -- # df -T 00:08:37.239 08:26:29 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_devtmpfs 00:08:37.239 08:26:29 -- common/autotest_common.sh@352 -- # fss["$mount"]=devtmpfs 00:08:37.239 08:26:29 -- common/autotest_common.sh@353 -- # avails["$mount"]=67108864 00:08:37.239 08:26:29 -- common/autotest_common.sh@353 -- # sizes["$mount"]=67108864 00:08:37.239 08:26:29 -- common/autotest_common.sh@354 -- # uses["$mount"]=0 00:08:37.239 08:26:29 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:37.239 08:26:29 -- common/autotest_common.sh@352 -- # mounts["$mount"]=/dev/pmem0 00:08:37.239 08:26:29 -- common/autotest_common.sh@352 -- # fss["$mount"]=ext2 00:08:37.239 08:26:29 -- common/autotest_common.sh@353 -- # avails["$mount"]=678330368 00:08:37.239 08:26:29 -- common/autotest_common.sh@353 -- # sizes["$mount"]=5284429824 00:08:37.239 08:26:29 -- common/autotest_common.sh@354 -- # uses["$mount"]=4606099456 00:08:37.239 08:26:29 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:37.239 08:26:29 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_root 00:08:37.239 08:26:29 -- common/autotest_common.sh@352 -- # fss["$mount"]=overlay 00:08:37.239 08:26:29 -- common/autotest_common.sh@353 -- # avails["$mount"]=52133462016 00:08:37.239 08:26:29 -- common/autotest_common.sh@353 -- # sizes["$mount"]=61730590720 00:08:37.239 08:26:29 -- common/autotest_common.sh@354 -- # uses["$mount"]=9597128704 00:08:37.239 08:26:29 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:37.239 08:26:29 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:37.239 08:26:29 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:37.239 08:26:29 -- common/autotest_common.sh@353 -- # avails["$mount"]=30864035840 00:08:37.239 08:26:29 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30865293312 00:08:37.239 08:26:29 -- common/autotest_common.sh@354 -- # uses["$mount"]=1257472 00:08:37.239 08:26:29 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:37.239 08:26:29 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:37.239 08:26:29 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:37.239 08:26:29 -- common/autotest_common.sh@353 -- # avails["$mount"]=12340125696 00:08:37.239 08:26:29 -- common/autotest_common.sh@353 -- # sizes["$mount"]=12346118144 00:08:37.239 08:26:29 -- common/autotest_common.sh@354 -- # uses["$mount"]=5992448 00:08:37.239 08:26:29 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:37.239 08:26:29 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:37.239 08:26:29 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:37.239 08:26:29 -- common/autotest_common.sh@353 -- # avails["$mount"]=30864527360 00:08:37.239 08:26:29 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30865297408 00:08:37.239 08:26:29 -- 
common/autotest_common.sh@354 -- # uses["$mount"]=770048
00:08:37.239 08:26:29 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount
00:08:37.239 08:26:29 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs
00:08:37.239 08:26:29 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs
00:08:37.239 08:26:29 -- common/autotest_common.sh@353 -- # avails["$mount"]=6173044736
00:08:37.239 08:26:29 -- common/autotest_common.sh@353 -- # sizes["$mount"]=6173057024
00:08:37.239 08:26:29 -- common/autotest_common.sh@354 -- # uses["$mount"]=12288
00:08:37.239 08:26:29 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount
00:08:37.239 08:26:29 -- common/autotest_common.sh@357 -- # printf '* Looking for test storage...\n'
00:08:37.239 * Looking for test storage...
00:08:37.239 08:26:29 -- common/autotest_common.sh@359 -- # local target_space new_size
00:08:37.239 08:26:29 -- common/autotest_common.sh@360 -- # for target_dir in "${storage_candidates[@]}"
00:08:37.239 08:26:29 -- common/autotest_common.sh@363 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf
00:08:37.239 08:26:29 -- common/autotest_common.sh@363 -- # awk '$1 !~ /Filesystem/{print $6}'
00:08:37.239 08:26:29 -- common/autotest_common.sh@363 -- # mount=/
00:08:37.239 08:26:29 -- common/autotest_common.sh@365 -- # target_space=52133462016
00:08:37.239 08:26:29 -- common/autotest_common.sh@366 -- # (( target_space == 0 || target_space < requested_size ))
00:08:37.239 08:26:29 -- common/autotest_common.sh@369 -- # (( target_space >= requested_size ))
00:08:37.239 08:26:29 -- common/autotest_common.sh@371 -- # [[ overlay == tmpfs ]]
00:08:37.239 08:26:29 -- common/autotest_common.sh@371 -- # [[ overlay == ramfs ]]
00:08:37.239 08:26:29 -- common/autotest_common.sh@371 -- # [[ / == / ]]
00:08:37.239 08:26:29 -- common/autotest_common.sh@372 -- # new_size=11811721216
00:08:37.239 08:26:29 -- common/autotest_common.sh@373 -- # (( new_size * 100 / sizes[/] > 95 ))
00:08:37.239 08:26:29 -- common/autotest_common.sh@378 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf
00:08:37.239 08:26:29 -- common/autotest_common.sh@378 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf
00:08:37.239 08:26:29 -- common/autotest_common.sh@379 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf
00:08:37.239 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf
00:08:37.239 08:26:29 -- common/autotest_common.sh@380 -- # return 0
00:08:37.239 08:26:29 -- common/autotest_common.sh@1667 -- # set -o errtrace
00:08:37.239 08:26:29 -- common/autotest_common.sh@1668 -- # shopt -s extdebug
00:08:37.239 08:26:29 -- common/autotest_common.sh@1669 -- # trap 'trap - ERR; print_backtrace >&2' ERR
00:08:37.239 08:26:29 -- common/autotest_common.sh@1671 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ '
00:08:37.239 08:26:29 -- common/autotest_common.sh@1672 -- # true
00:08:37.239 08:26:29 -- common/autotest_common.sh@1674 -- # xtrace_fd
00:08:37.239 08:26:29 -- common/autotest_common.sh@25 -- # [[ -n 14 ]]
00:08:37.239 08:26:29 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]]
00:08:37.239 08:26:29 -- common/autotest_common.sh@27 -- # exec
00:08:37.239 08:26:29 -- common/autotest_common.sh@29 -- # exec
00:08:37.239 08:26:29 -- common/autotest_common.sh@31 -- # xtrace_restore
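
For readability, the storage scan traced above (autotest_common.sh@318-380) boils down to the following selection loop. This is a sketch reconstructed from the xtrace, not the script itself: the arrays fss/avails/sizes/uses are filled from the 'df -T' parse shown earlier, and the function framing is assumed.

  set_test_storage() {
      local requested_size=$1 target_space new_size
      for target_dir in "${storage_candidates[@]}"; do
          # resolve the mount point backing this candidate directory
          local mount
          mount=$(df "$target_dir" | awk '$1 !~ /Filesystem/{print $6}')
          target_space=${avails[$mount]}
          # skip candidates without enough free space
          (( target_space == 0 || target_space < requested_size )) && continue
          if [[ ${fss[$mount]} == tmpfs || ${fss[$mount]} == ramfs || $mount == / ]]; then
              # would the request push the filesystem above ~95% full?
              new_size=$(( uses[$mount] + requested_size ))
              (( new_size * 100 / sizes[$mount] > 95 )) && continue
          fi
          export SPDK_TEST_STORAGE=$target_dir
          printf '* Found test storage at %s\n' "$target_dir"
          return 0
      done
  }

In the run above the root overlay mount wins: 52133462016 bytes free against a requested 2214592512, with new_size landing at roughly 19% of the filesystem, so the nvmf fuzz directory itself becomes SPDK_TEST_STORAGE.
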
00:08:37.239 08:26:29 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]'
00:08:37.239 08:26:29 -- common/autotest_common.sh@17 -- # (( 0 == 0 ))
00:08:37.239 08:26:29 -- common/autotest_common.sh@18 -- # set -x
00:08:37.239 08:26:29 -- nvmf/run.sh@53 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh
00:08:37.239 08:26:29 -- ../common.sh@8 -- # pids=()
00:08:37.239 08:26:29 -- nvmf/run.sh@55 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c
00:08:37.239 08:26:29 -- nvmf/run.sh@56 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c
00:08:37.239 08:26:29 -- nvmf/run.sh@56 -- # fuzz_num=25
00:08:37.239 08:26:29 -- nvmf/run.sh@57 -- # (( fuzz_num != 0 ))
00:08:37.239 08:26:29 -- nvmf/run.sh@59 -- # trap 'cleanup /tmp/llvm_fuzz*; exit 1' SIGINT SIGTERM EXIT
00:08:37.239 08:26:29 -- nvmf/run.sh@61 -- # mem_size=512
00:08:37.239 08:26:29 -- nvmf/run.sh@62 -- # [[ 1 -eq 1 ]]
00:08:37.239 08:26:29 -- nvmf/run.sh@63 -- # start_llvm_fuzz_short 25 1
00:08:37.239 08:26:29 -- ../common.sh@69 -- # local fuzz_num=25
00:08:37.239 08:26:29 -- ../common.sh@70 -- # local time=1
00:08:37.239 08:26:29 -- ../common.sh@72 -- # (( i = 0 ))
00:08:37.239 08:26:29 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:37.239 08:26:29 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1
00:08:37.239 08:26:29 -- nvmf/run.sh@23 -- # local fuzzer_type=0
00:08:37.239 08:26:29 -- nvmf/run.sh@24 -- # local timen=1
00:08:37.239 08:26:29 -- nvmf/run.sh@25 -- # local core=0x1
00:08:37.239 08:26:29 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0
00:08:37.239 08:26:29 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf
00:08:37.239 08:26:29 -- nvmf/run.sh@29 -- # printf %02d 0
00:08:37.239 08:26:29 -- nvmf/run.sh@29 -- # port=4400
00:08:37.239 08:26:29 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0
00:08:37.239 08:26:29 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400'
00:08:37.239 08:26:29 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:37.239 08:26:29 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 -r /var/tmp/spdk0.sock
00:08:37.239 [2024-10-04 08:26:29.746698] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization...
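
The per-fuzzer setup traced at nvmf/run.sh@23-36 amounts to the sketch below. The individual commands are taken from the xtrace; the function wrapper, the redirection of sed's output into $nvmf_cfg, and the cleanup helper are reconstructions, and rootdir/mem_size are assumed to be set by the surrounding scripts.

  start_llvm_fuzz() {
      local fuzzer_type=$1 timen=$2 core=$3
      local corpus_dir=$rootdir/../corpus/llvm_nvmf_$fuzzer_type
      local nvmf_cfg=/tmp/fuzz_json_$fuzzer_type.conf
      # fuzzer N gets TCP port 44NN, e.g. fuzzer 0 -> 4400
      local port="44$(printf %02d "$fuzzer_type")"
      mkdir -p "$corpus_dir"
      local trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
      # rewrite the JSON template so the target listens on this fuzzer's port
      sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
          "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
      # -t is the per-fuzzer time budget (1 for the short run), -D the
      # persistent corpus, -Z 0 the fuzzer table entry exercised here
      "$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m "$core" -s "$mem_size" \
          -P "$rootdir/../output/llvm/" -F "$trid" -c "$nvmf_cfg" -t "$timen" \
          -D "$corpus_dir" -Z "$fuzzer_type" -r "/var/tmp/spdk$fuzzer_type.sock"
  }

Before this, run.sh@56 sizes the loop by counting '.fn =' initializers in llvm_nvme_fuzz.c — 25 here — so start_llvm_fuzz_short iterates fuzzer types 0 through 24.
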
00:08:37.239 [2024-10-04 08:26:29.746768] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1012004 ] 00:08:37.239 EAL: No free 2048 kB hugepages reported on node 1 00:08:37.498 [2024-10-04 08:26:29.928007] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:37.498 [2024-10-04 08:26:29.947427] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:37.498 [2024-10-04 08:26:29.947552] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:37.498 [2024-10-04 08:26:29.999095] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:37.498 [2024-10-04 08:26:30.015457] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:08:37.498 INFO: Running with entropic power schedule (0xFF, 100). 00:08:37.498 INFO: Seed: 2506689368 00:08:37.498 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:37.498 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:37.498 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:08:37.498 INFO: A corpus is not provided, starting from an empty corpus 00:08:37.498 #2 INITED exec/s: 0 rss: 59Mb 00:08:37.498 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:37.498 This may also happen if the target rejected all inputs we tried so far 00:08:37.498 [2024-10-04 08:26:30.085455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:4 nsid:7e7e7e7e cdw10:7e7e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e7e7e7e7e7e7e7e 00:08:37.498 [2024-10-04 08:26:30.085493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.757 NEW_FUNC[1/670]: 0x451418 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:08:37.757 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:37.757 #35 NEW cov: 11550 ft: 11551 corp: 2/67b lim: 320 exec/s: 0 rss: 67Mb L: 66/66 MS: 3 InsertRepeatedBytes-CMP-CopyPart- DE: "\000h\340m\016\022XR"- 00:08:37.757 [2024-10-04 08:26:30.406217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:4 nsid:7e7e7e7e cdw10:7e7e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e7e7e7e7e7e7e7e 00:08:37.757 [2024-10-04 08:26:30.406266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.757 #36 NEW cov: 11663 ft: 12264 corp: 3/170b lim: 320 exec/s: 0 rss: 68Mb L: 103/103 MS: 1 CopyPart- 00:08:38.016 [2024-10-04 08:26:30.466375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:4 nsid:7e7e7e7e cdw10:7e7e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e7e7e7e7e7e7e7e 00:08:38.016 [2024-10-04 08:26:30.466403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.016 #37 NEW cov: 11669 ft: 12491 corp: 4/236b lim: 320 exec/s: 0 rss: 68Mb L: 66/103 MS: 1 CopyPart- 00:08:38.016 [2024-10-04 
08:26:30.516457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:4 nsid:7e7e7e7e cdw10:7e7e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e7e7e0068e06cc9 00:08:38.016 [2024-10-04 08:26:30.516485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.016 #39 NEW cov: 11754 ft: 12748 corp: 5/306b lim: 320 exec/s: 0 rss: 68Mb L: 70/103 MS: 2 EraseBytes-CMP- DE: "er&\311l\340h\000"- 00:08:38.016 [2024-10-04 08:26:30.576665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:4 nsid:7e7e7e7e cdw10:7e7e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e7e7e7e7e7e7e7e 00:08:38.016 [2024-10-04 08:26:30.576694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.016 #40 NEW cov: 11754 ft: 12833 corp: 6/409b lim: 320 exec/s: 0 rss: 68Mb L: 103/103 MS: 1 ChangeBit- 00:08:38.016 [2024-10-04 08:26:30.626871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f5) qid:0 cid:4 nsid:f5f5f5f5 cdw10:f5f5f5f5 cdw11:f5f5f5f5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.016 [2024-10-04 08:26:30.626898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.016 NEW_FUNC[1/1]: 0x16dd468 in nvme_get_sgl_unkeyed /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:143 00:08:38.016 #41 NEW cov: 11767 ft: 13249 corp: 7/489b lim: 320 exec/s: 0 rss: 68Mb L: 80/103 MS: 1 InsertRepeatedBytes- 00:08:38.016 [2024-10-04 08:26:30.666857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:4 nsid:7e7e7e7e cdw10:7e7e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e7e7e7e7e7e7e7e 00:08:38.016 [2024-10-04 08:26:30.666885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.016 #42 NEW cov: 11767 ft: 13434 corp: 8/555b lim: 320 exec/s: 0 rss: 68Mb L: 66/103 MS: 1 ShuffleBytes- 00:08:38.275 [2024-10-04 08:26:30.717146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (76) qid:0 cid:4 nsid:7e7e7e7e cdw10:7e7e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e7e7e7e7e7e7e7e 00:08:38.275 [2024-10-04 08:26:30.717175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.275 #48 NEW cov: 11767 ft: 13447 corp: 9/621b lim: 320 exec/s: 0 rss: 68Mb L: 66/103 MS: 1 ChangeBit- 00:08:38.275 [2024-10-04 08:26:30.757308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:4 nsid:7e7e7e7e cdw10:79797979 cdw11:79797979 SGL TRANSPORT DATA BLOCK TRANSPORT 0x797979797e7e7e7e 00:08:38.275 [2024-10-04 08:26:30.757341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.275 [2024-10-04 08:26:30.757476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:5 nsid:7e7e7e7e cdw10:7e7e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e7e7e7e7e7e7e7e 00:08:38.275 [2024-10-04 08:26:30.757494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:38.275 #49 NEW cov: 11767 ft: 13752 corp: 10/755b lim: 320 exec/s: 0 rss: 68Mb L: 
134/134 MS: 1 InsertRepeatedBytes- 00:08:38.276 [2024-10-04 08:26:30.807394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (76) qid:0 cid:4 nsid:7e7e7e7e cdw10:7e7e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x68e06cc9267265 00:08:38.276 [2024-10-04 08:26:30.807421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.276 #50 NEW cov: 11767 ft: 13776 corp: 11/829b lim: 320 exec/s: 0 rss: 68Mb L: 74/134 MS: 1 PersAutoDict- DE: "er&\311l\340h\000"- 00:08:38.276 [2024-10-04 08:26:30.857812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:4 nsid:7e7e7e7e cdw10:7e7e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e7e7e7e7e7e7e7e 00:08:38.276 [2024-10-04 08:26:30.857839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.276 [2024-10-04 08:26:30.857983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:5 nsid:47474747 cdw10:7e7e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e7e474747474747 00:08:38.276 [2024-10-04 08:26:30.857999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:38.276 #51 NEW cov: 11767 ft: 13824 corp: 12/959b lim: 320 exec/s: 0 rss: 68Mb L: 130/134 MS: 1 InsertRepeatedBytes- 00:08:38.276 [2024-10-04 08:26:30.897227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:4 nsid:7e7e7e7e cdw10:7e7e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e7e7e7e7e7e7e7e 00:08:38.276 [2024-10-04 08:26:30.897254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.276 #52 NEW cov: 11767 ft: 13879 corp: 13/1025b lim: 320 exec/s: 0 rss: 68Mb L: 66/134 MS: 1 ChangeBinInt- 00:08:38.276 [2024-10-04 08:26:30.937677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (76) qid:0 cid:4 nsid:7e7e7e7e cdw10:7e7e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x68e06cc9267265 00:08:38.276 [2024-10-04 08:26:30.937704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.535 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:38.535 #53 NEW cov: 11790 ft: 13941 corp: 14/1099b lim: 320 exec/s: 0 rss: 68Mb L: 74/134 MS: 1 ShuffleBytes- 00:08:38.535 [2024-10-04 08:26:30.988029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:4 nsid:7e7e7efe cdw10:7e7e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e7e7e7e7e7e7e7e 00:08:38.535 [2024-10-04 08:26:30.988055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.535 [2024-10-04 08:26:30.988208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:5 nsid:47474747 cdw10:7e7e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e7e474747474747 00:08:38.535 [2024-10-04 08:26:30.988225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:38.535 #54 NEW cov: 11790 ft: 14039 corp: 15/1229b lim: 320 exec/s: 0 rss: 68Mb L: 130/134 MS: 1 ChangeBit- 00:08:38.535 
[2024-10-04 08:26:31.027982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:4 nsid:81807e7e cdw10:7e7e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e7e7e0068e06cc9 00:08:38.535 [2024-10-04 08:26:31.028015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.535 #55 NEW cov: 11790 ft: 14083 corp: 16/1299b lim: 320 exec/s: 0 rss: 68Mb L: 70/134 MS: 1 ChangeBinInt- 00:08:38.535 [2024-10-04 08:26:31.078167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (76) qid:0 cid:4 nsid:7e7e7e7e cdw10:7e7e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x68e06cc9267265 00:08:38.535 [2024-10-04 08:26:31.078196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.535 #56 NEW cov: 11790 ft: 14143 corp: 17/1373b lim: 320 exec/s: 56 rss: 69Mb L: 74/134 MS: 1 ChangeBit- 00:08:38.535 [2024-10-04 08:26:31.128330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:4 nsid:7e7e7e7e cdw10:7e7e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e7e7e7e7e7e7e7e 00:08:38.535 [2024-10-04 08:26:31.128356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.535 #57 NEW cov: 11790 ft: 14158 corp: 18/1439b lim: 320 exec/s: 57 rss: 69Mb L: 66/134 MS: 1 ChangeBit- 00:08:38.535 [2024-10-04 08:26:31.178484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:4 nsid:7e7e7e7e cdw10:7e7e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e7e7e7e7e7e7e7e 00:08:38.535 [2024-10-04 08:26:31.178511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.535 #58 NEW cov: 11790 ft: 14168 corp: 19/1505b lim: 320 exec/s: 58 rss: 69Mb L: 66/134 MS: 1 ChangeByte- 00:08:38.794 [2024-10-04 08:26:31.218485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (76) qid:0 cid:4 nsid:7e7e7e7e cdw10:7e7e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x68e06cc9267265 00:08:38.794 [2024-10-04 08:26:31.218512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.794 #59 NEW cov: 11790 ft: 14241 corp: 20/1579b lim: 320 exec/s: 59 rss: 69Mb L: 74/134 MS: 1 PersAutoDict- DE: "\000h\340m\016\022XR"- 00:08:38.794 [2024-10-04 08:26:31.258764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:4 nsid:7e7e7e7e cdw10:7e7e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e7e7e7e7e7e7e7e 00:08:38.794 [2024-10-04 08:26:31.258792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.794 #60 NEW cov: 11790 ft: 14250 corp: 21/1645b lim: 320 exec/s: 60 rss: 69Mb L: 66/134 MS: 1 ChangeByte- 00:08:38.794 [2024-10-04 08:26:31.299091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:4 nsid:7e7e7e7e cdw10:7e7e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e7e7e0068e06cc9 00:08:38.794 [2024-10-04 08:26:31.299117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.794 [2024-10-04 08:26:31.299255] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (72) qid:0 cid:5 nsid:7e0068e0 cdw10:120e6de0 cdw11:7e7e5258 SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e7e7e7e7e7e7e7e 00:08:38.794 [2024-10-04 08:26:31.299272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:38.794 #61 NEW cov: 11790 ft: 14257 corp: 22/1773b lim: 320 exec/s: 61 rss: 69Mb L: 128/134 MS: 1 CrossOver- 00:08:38.794 [2024-10-04 08:26:31.338555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (76) qid:0 cid:4 nsid:7e7e7e7e cdw10:7e7e6e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x5258120e6de06800 00:08:38.794 [2024-10-04 08:26:31.338584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.794 #62 NEW cov: 11790 ft: 14261 corp: 23/1855b lim: 320 exec/s: 62 rss: 69Mb L: 82/134 MS: 1 PersAutoDict- DE: "\000h\340m\016\022XR"- 00:08:38.794 [2024-10-04 08:26:31.378640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (76) qid:0 cid:4 nsid:7e7e7e7e cdw10:7e7e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0xe06cc92672657265 00:08:38.794 [2024-10-04 08:26:31.378670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.794 #63 NEW cov: 11790 ft: 14338 corp: 24/1929b lim: 320 exec/s: 63 rss: 69Mb L: 74/134 MS: 1 PersAutoDict- DE: "er&\311l\340h\000"- 00:08:38.794 [2024-10-04 08:26:31.429222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:4 nsid:7e7e7e7e cdw10:767e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x6cc92672657e7e7e 00:08:38.794 [2024-10-04 08:26:31.429250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.794 #64 NEW cov: 11790 ft: 14379 corp: 25/2040b lim: 320 exec/s: 64 rss: 69Mb L: 111/134 MS: 1 PersAutoDict- DE: "er&\311l\340h\000"- 00:08:39.053 [2024-10-04 08:26:31.479430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:4 nsid:7e7e7e7e cdw10:7e7e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e7e0068e06cc926 00:08:39.053 [2024-10-04 08:26:31.479457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.053 #65 NEW cov: 11790 ft: 14410 corp: 26/2111b lim: 320 exec/s: 65 rss: 69Mb L: 71/134 MS: 1 InsertByte- 00:08:39.053 [2024-10-04 08:26:31.519735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:4 nsid:7e7e7e7e cdw10:7e7e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e7e7e0068e06cc9 00:08:39.053 [2024-10-04 08:26:31.519761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.053 [2024-10-04 08:26:31.519911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (72) qid:0 cid:5 nsid:7e0068e0 cdw10:120e6de0 cdw11:7e7e5258 SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e7e7e7e7e7e7e7e 00:08:39.053 [2024-10-04 08:26:31.519927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.053 #66 NEW cov: 11790 ft: 14472 corp: 27/2239b lim: 320 exec/s: 66 rss: 69Mb L: 128/134 MS: 1 ChangeBinInt- 00:08:39.053 [2024-10-04 08:26:31.559258] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:4 nsid:7e7e7e7e cdw10:7e7e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e7e7e7e7e7e7e7e 00:08:39.053 [2024-10-04 08:26:31.559285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.053 #67 NEW cov: 11790 ft: 14515 corp: 28/2305b lim: 320 exec/s: 67 rss: 69Mb L: 66/134 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:39.053 [2024-10-04 08:26:31.599894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:4 nsid:7e7e7e7e cdw10:7e7e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e7e7e7e7e7e7e7e 00:08:39.053 [2024-10-04 08:26:31.599923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.053 [2024-10-04 08:26:31.600054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:5 nsid:47474747 cdw10:7e7e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e7e7e7e7e7e4747 00:08:39.053 [2024-10-04 08:26:31.600070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.053 #68 NEW cov: 11790 ft: 14551 corp: 29/2464b lim: 320 exec/s: 68 rss: 69Mb L: 159/159 MS: 1 CrossOver- 00:08:39.053 [2024-10-04 08:26:31.639517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:4 nsid:7e7e7e7e cdw10:767e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x6cc92672657e7e7e 00:08:39.053 [2024-10-04 08:26:31.639545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.053 #69 NEW cov: 11790 ft: 14568 corp: 30/2575b lim: 320 exec/s: 69 rss: 69Mb L: 111/159 MS: 1 ChangeBinInt- 00:08:39.053 [2024-10-04 08:26:31.679939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:4 nsid:7e7e7e7e cdw10:7e7e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e7e7e0068e06cc9 00:08:39.053 [2024-10-04 08:26:31.679969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.053 #70 NEW cov: 11790 ft: 14573 corp: 31/2645b lim: 320 exec/s: 70 rss: 69Mb L: 70/159 MS: 1 EraseBytes- 00:08:39.053 [2024-10-04 08:26:31.719873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:4 nsid:7e7e7efe cdw10:7e7e7e7e cdw11:7e7e0082 SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e7e7e7e7e7e7e7e 00:08:39.053 [2024-10-04 08:26:31.719901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.053 [2024-10-04 08:26:31.720030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:5 nsid:47474747 cdw10:7e7e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e7e474747474747 00:08:39.053 [2024-10-04 08:26:31.720046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.311 #71 NEW cov: 11790 ft: 14583 corp: 32/2775b lim: 320 exec/s: 71 rss: 69Mb L: 130/159 MS: 1 ChangeBinInt- 00:08:39.311 [2024-10-04 08:26:31.770020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:4 nsid:7e7e7e7e cdw10:7e7e7e7e cdw11:ff7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e7e7e7e7e7e7e7e 
00:08:39.311 [2024-10-04 08:26:31.770048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.311 [2024-10-04 08:26:31.770197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:08:39.311 [2024-10-04 08:26:31.770214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.311 NEW_FUNC[1/1]: 0x12de638 in nvmf_tcp_req_set_cpl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:2014 00:08:39.311 #72 NEW cov: 11821 ft: 14622 corp: 33/2916b lim: 320 exec/s: 72 rss: 69Mb L: 141/159 MS: 1 InsertRepeatedBytes- 00:08:39.311 [2024-10-04 08:26:31.830609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:4 nsid:7e7e7efe cdw10:7e7e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e7e7e7e7e7e7e7e 00:08:39.311 [2024-10-04 08:26:31.830635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.311 [2024-10-04 08:26:31.830792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:5 nsid:47474747 cdw10:7e7e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e47474747474747 00:08:39.311 [2024-10-04 08:26:31.830808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.311 #73 NEW cov: 11821 ft: 14631 corp: 34/3047b lim: 320 exec/s: 73 rss: 69Mb L: 131/159 MS: 1 InsertByte- 00:08:39.311 [2024-10-04 08:26:31.870146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:4 nsid:81807e7e cdw10:7e7e7e7e cdw11:7e7e7e45 SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e7e7e0068e06cc9 00:08:39.311 [2024-10-04 08:26:31.870174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.311 #74 NEW cov: 11821 ft: 14636 corp: 35/3118b lim: 320 exec/s: 74 rss: 70Mb L: 71/159 MS: 1 InsertByte- 00:08:39.311 [2024-10-04 08:26:31.920743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:4 nsid:7e7e7e7e cdw10:7e7e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e7e0068e06cc926 00:08:39.311 [2024-10-04 08:26:31.920772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.311 #75 NEW cov: 11821 ft: 14663 corp: 36/3189b lim: 320 exec/s: 75 rss: 70Mb L: 71/159 MS: 1 ChangeBinInt- 00:08:39.311 [2024-10-04 08:26:31.960565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:4 nsid:7e7e7e7e cdw10:79797979 cdw11:79797979 SGL TRANSPORT DATA BLOCK TRANSPORT 0x797979797e7e7e7e 00:08:39.311 [2024-10-04 08:26:31.960597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.311 [2024-10-04 08:26:31.960731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:5 nsid:7e7e7e7e cdw10:7e7e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e7e7e7e7e7e7e7e 00:08:39.311 [2024-10-04 08:26:31.960749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.311 #76 
NEW cov: 11821 ft: 14668 corp: 37/3323b lim: 320 exec/s: 76 rss: 70Mb L: 134/159 MS: 1 ChangeByte- 00:08:39.570 [2024-10-04 08:26:32.021622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:4 nsid:7e7e7e7e cdw10:79797979 cdw11:79797979 SGL TRANSPORT DATA BLOCK TRANSPORT 0x797979797e7e7e7e 00:08:39.570 [2024-10-04 08:26:32.021649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.570 [2024-10-04 08:26:32.021794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:5 nsid:7e7e7e7e cdw10:7e7e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e7e7e7e7e7e7e7e 00:08:39.571 [2024-10-04 08:26:32.021811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.571 [2024-10-04 08:26:32.021947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:6 nsid:7e7e7e7e cdw10:7e7e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e7e7e7e7e7e7e7e 00:08:39.571 [2024-10-04 08:26:32.021965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.571 [2024-10-04 08:26:32.022098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:7 nsid:47474747 cdw10:7e7e7e7e cdw11:007e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e7e7e7e7e7e7e7e 00:08:39.571 [2024-10-04 08:26:32.022117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:39.571 #77 NEW cov: 11821 ft: 15141 corp: 38/3596b lim: 320 exec/s: 77 rss: 70Mb L: 273/273 MS: 1 CrossOver- 00:08:39.571 [2024-10-04 08:26:32.071146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:4 nsid:7e7e7e7e cdw10:7e7e7e7e cdw11:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e0068e06cc92672 00:08:39.571 [2024-10-04 08:26:32.071172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.571 #83 NEW cov: 11821 ft: 15162 corp: 39/3668b lim: 320 exec/s: 41 rss: 70Mb L: 72/273 MS: 1 InsertByte- 00:08:39.571 #83 DONE cov: 11821 ft: 15162 corp: 39/3668b lim: 320 exec/s: 41 rss: 70Mb 00:08:39.571 ###### Recommended dictionary. ###### 00:08:39.571 "\000h\340m\016\022XR" # Uses: 2 00:08:39.571 "er&\311l\340h\000" # Uses: 3 00:08:39.571 "\000\000\000\000\000\000\000\000" # Uses: 0 00:08:39.571 ###### End of recommended dictionary. 
###### 00:08:39.571 Done 83 runs in 2 second(s) 00:08:39.571 08:26:32 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_0.conf 00:08:39.571 08:26:32 -- ../common.sh@72 -- # (( i++ )) 00:08:39.571 08:26:32 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:39.571 08:26:32 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:08:39.571 08:26:32 -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:08:39.571 08:26:32 -- nvmf/run.sh@24 -- # local timen=1 00:08:39.571 08:26:32 -- nvmf/run.sh@25 -- # local core=0x1 00:08:39.571 08:26:32 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:08:39.571 08:26:32 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:08:39.571 08:26:32 -- nvmf/run.sh@29 -- # printf %02d 1 00:08:39.571 08:26:32 -- nvmf/run.sh@29 -- # port=4401 00:08:39.571 08:26:32 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:08:39.571 08:26:32 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:08:39.571 08:26:32 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:39.571 08:26:32 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 -r /var/tmp/spdk1.sock 00:08:39.571 [2024-10-04 08:26:32.247383] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:08:39.571 [2024-10-04 08:26:32.247476] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1012471 ] 00:08:39.830 EAL: No free 2048 kB hugepages reported on node 1 00:08:39.830 [2024-10-04 08:26:32.424261] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:39.830 [2024-10-04 08:26:32.443064] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:39.830 [2024-10-04 08:26:32.443184] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:39.830 [2024-10-04 08:26:32.494404] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:39.830 [2024-10-04 08:26:32.510736] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:08:40.088 INFO: Running with entropic power schedule (0xFF, 100). 00:08:40.088 INFO: Seed: 708571139 00:08:40.088 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:40.088 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:40.088 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:08:40.088 INFO: A corpus is not provided, starting from an empty corpus 00:08:40.089 #2 INITED exec/s: 0 rss: 59Mb 00:08:40.089 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:40.089 This may also happen if the target rejected all inputs we tried so far 00:08:40.089 [2024-10-04 08:26:32.586605] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:08:40.089 [2024-10-04 08:26:32.587126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.089 [2024-10-04 08:26:32.587162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.089 [2024-10-04 08:26:32.587293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.089 [2024-10-04 08:26:32.587311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.348 NEW_FUNC[1/671]: 0x451d18 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:08:40.348 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:40.348 #15 NEW cov: 11660 ft: 11648 corp: 2/18b lim: 30 exec/s: 0 rss: 67Mb L: 17/17 MS: 3 InsertByte-InsertByte-InsertRepeatedBytes- 00:08:40.348 [2024-10-04 08:26:32.907619] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:08:40.348 [2024-10-04 08:26:32.908215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.348 [2024-10-04 08:26:32.908254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.348 [2024-10-04 08:26:32.908390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.348 [2024-10-04 08:26:32.908408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.348 #16 NEW cov: 11773 ft: 12200 corp: 3/35b lim: 30 exec/s: 0 rss: 67Mb L: 17/17 MS: 1 CopyPart- 00:08:40.348 [2024-10-04 08:26:32.967830] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:08:40.348 [2024-10-04 08:26:32.968173] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (524292) > buf size (4096) 00:08:40.348 [2024-10-04 08:26:32.968577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.348 [2024-10-04 08:26:32.968611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.348 [2024-10-04 08:26:32.968760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.348 [2024-10-04 08:26:32.968780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.348 [2024-10-04 08:26:32.968921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 
nsid:0 cdw10:00000296 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.348 [2024-10-04 08:26:32.968940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.348 #17 NEW cov: 11779 ft: 12860 corp: 4/55b lim: 30 exec/s: 0 rss: 67Mb L: 20/20 MS: 1 InsertRepeatedBytes- 00:08:40.348 [2024-10-04 08:26:33.028056] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:08:40.348 [2024-10-04 08:26:33.028391] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003296 00:08:40.348 [2024-10-04 08:26:33.028776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.348 [2024-10-04 08:26:33.028806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.348 [2024-10-04 08:26:33.028899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.348 [2024-10-04 08:26:33.028919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.348 [2024-10-04 08:26:33.029061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000296 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.348 [2024-10-04 08:26:33.029080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.607 #18 NEW cov: 11870 ft: 13125 corp: 5/76b lim: 30 exec/s: 0 rss: 67Mb L: 21/21 MS: 1 InsertByte- 00:08:40.607 [2024-10-04 08:26:33.088041] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:08:40.607 [2024-10-04 08:26:33.088420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.607 [2024-10-04 08:26:33.088451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.607 #19 NEW cov: 11870 ft: 13597 corp: 6/87b lim: 30 exec/s: 0 rss: 67Mb L: 11/21 MS: 1 EraseBytes- 00:08:40.607 [2024-10-04 08:26:33.138413] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:08:40.607 [2024-10-04 08:26:33.138766] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003296 00:08:40.607 [2024-10-04 08:26:33.139175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.607 [2024-10-04 08:26:33.139210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.607 [2024-10-04 08:26:33.139349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.607 [2024-10-04 08:26:33.139370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.607 [2024-10-04 08:26:33.139519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00050296 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.608 [2024-10-04 08:26:33.139538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.608 #20 NEW cov: 11870 ft: 13668 corp: 7/108b lim: 30 exec/s: 0 rss: 67Mb L: 21/21 MS: 1 ChangeBinInt- 00:08:40.608 [2024-10-04 08:26:33.198589] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:08:40.608 [2024-10-04 08:26:33.198911] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (524524) > buf size (4096) 00:08:40.608 [2024-10-04 08:26:33.199291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.608 [2024-10-04 08:26:33.199324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.608 [2024-10-04 08:26:33.199462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.608 [2024-10-04 08:26:33.199480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.608 [2024-10-04 08:26:33.199623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:003a0296 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.608 [2024-10-04 08:26:33.199641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.608 #21 NEW cov: 11870 ft: 13701 corp: 8/128b lim: 30 exec/s: 0 rss: 67Mb L: 20/21 MS: 1 ChangeByte- 00:08:40.608 [2024-10-04 08:26:33.248678] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:08:40.608 [2024-10-04 08:26:33.249016] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200009696 00:08:40.608 [2024-10-04 08:26:33.249403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.608 [2024-10-04 08:26:33.249433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.608 [2024-10-04 08:26:33.249575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.608 [2024-10-04 08:26:33.249596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.608 [2024-10-04 08:26:33.249737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:0000023a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.608 [2024-10-04 08:26:33.249756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.608 #22 NEW cov: 11870 ft: 13729 corp: 9/149b lim: 30 exec/s: 0 rss: 67Mb L: 21/21 MS: 1 InsertByte- 00:08:40.867 [2024-10-04 08:26:33.308934] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:08:40.867 [2024-10-04 08:26:33.309271] 
ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (524524) > buf size (4096) 00:08:40.867 [2024-10-04 08:26:33.309650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.867 [2024-10-04 08:26:33.309680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.867 [2024-10-04 08:26:33.309833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.867 [2024-10-04 08:26:33.309855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.867 [2024-10-04 08:26:33.310006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:003a0294 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.867 [2024-10-04 08:26:33.310024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.867 #23 NEW cov: 11870 ft: 13752 corp: 10/169b lim: 30 exec/s: 0 rss: 67Mb L: 20/21 MS: 1 ChangeBit- 00:08:40.867 [2024-10-04 08:26:33.359151] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:08:40.867 [2024-10-04 08:26:33.359493] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003296 00:08:40.867 [2024-10-04 08:26:33.359671] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (524892) > buf size (4096) 00:08:40.867 [2024-10-04 08:26:33.360051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.867 [2024-10-04 08:26:33.360081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.867 [2024-10-04 08:26:33.360221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.867 [2024-10-04 08:26:33.360242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.867 [2024-10-04 08:26:33.360375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00050296 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.867 [2024-10-04 08:26:33.360394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.867 [2024-10-04 08:26:33.360528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00960296 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.867 [2024-10-04 08:26:33.360548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:40.867 #24 NEW cov: 11870 ft: 14278 corp: 11/196b lim: 30 exec/s: 0 rss: 68Mb L: 27/27 MS: 1 CrossOver- 00:08:40.867 [2024-10-04 08:26:33.419095] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:40.867 [2024-10-04 08:26:33.419480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 
cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.867 [2024-10-04 08:26:33.419510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.867 #26 NEW cov: 11870 ft: 14354 corp: 12/207b lim: 30 exec/s: 0 rss: 68Mb L: 11/27 MS: 2 ChangeBinInt-InsertRepeatedBytes- 00:08:40.867 [2024-10-04 08:26:33.469469] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:08:40.867 [2024-10-04 08:26:33.469645] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (2560) > len (4) 00:08:40.867 [2024-10-04 08:26:33.470313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.867 [2024-10-04 08:26:33.470345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.867 [2024-10-04 08:26:33.470485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.867 [2024-10-04 08:26:33.470507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.867 [2024-10-04 08:26:33.470650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.867 [2024-10-04 08:26:33.470672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.867 [2024-10-04 08:26:33.470826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.867 [2024-10-04 08:26:33.470843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:40.867 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:40.867 #27 NEW cov: 11899 ft: 14441 corp: 13/234b lim: 30 exec/s: 0 rss: 68Mb L: 27/27 MS: 1 CopyPart- 00:08:40.867 [2024-10-04 08:26:33.519433] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:08:40.867 [2024-10-04 08:26:33.519614] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (34820) > buf size (4096) 00:08:40.867 [2024-10-04 08:26:33.519991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.867 [2024-10-04 08:26:33.520021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.867 [2024-10-04 08:26:33.520162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:22000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.867 [2024-10-04 08:26:33.520181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.867 #28 NEW cov: 11899 ft: 14503 corp: 14/251b lim: 30 exec/s: 0 rss: 68Mb L: 17/27 MS: 1 ChangeByte- 00:08:41.126 [2024-10-04 08:26:33.569714] 
ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:08:41.126 [2024-10-04 08:26:33.570038] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xa 00:08:41.126 [2024-10-04 08:26:33.570432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.126 [2024-10-04 08:26:33.570463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.126 [2024-10-04 08:26:33.570604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.126 [2024-10-04 08:26:33.570623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.126 [2024-10-04 08:26:33.570759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.126 [2024-10-04 08:26:33.570778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:41.126 #29 NEW cov: 11899 ft: 14511 corp: 15/269b lim: 30 exec/s: 29 rss: 68Mb L: 18/27 MS: 1 CrossOver- 00:08:41.126 [2024-10-04 08:26:33.619895] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:08:41.126 [2024-10-04 08:26:33.620235] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003296 00:08:41.126 [2024-10-04 08:26:33.620601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.126 [2024-10-04 08:26:33.620635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.126 [2024-10-04 08:26:33.620780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.126 [2024-10-04 08:26:33.620803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.126 [2024-10-04 08:26:33.620945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000296 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.126 [2024-10-04 08:26:33.620964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:41.126 #30 NEW cov: 11899 ft: 14535 corp: 16/290b lim: 30 exec/s: 30 rss: 68Mb L: 21/27 MS: 1 CopyPart- 00:08:41.126 [2024-10-04 08:26:33.670027] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:08:41.126 [2024-10-04 08:26:33.670364] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003296 00:08:41.126 [2024-10-04 08:26:33.670787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.126 [2024-10-04 08:26:33.670819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.126 
[2024-10-04 08:26:33.670962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:000000f8 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.126 [2024-10-04 08:26:33.670983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.126 [2024-10-04 08:26:33.671150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00050296 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.126 [2024-10-04 08:26:33.671166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:41.126 #31 NEW cov: 11899 ft: 14551 corp: 17/311b lim: 30 exec/s: 31 rss: 68Mb L: 21/27 MS: 1 CMP- DE: "\370\000\000\000"- 00:08:41.126 [2024-10-04 08:26:33.720292] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:08:41.126 [2024-10-04 08:26:33.720641] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003296 00:08:41.126 [2024-10-04 08:26:33.721038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.126 [2024-10-04 08:26:33.721069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.126 [2024-10-04 08:26:33.721212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0000002d cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.126 [2024-10-04 08:26:33.721232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.126 [2024-10-04 08:26:33.721383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00050296 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.126 [2024-10-04 08:26:33.721401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:41.126 #32 NEW cov: 11899 ft: 14562 corp: 18/332b lim: 30 exec/s: 32 rss: 68Mb L: 21/27 MS: 1 ChangeByte- 00:08:41.126 [2024-10-04 08:26:33.780623] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10412) > buf size (4096) 00:08:41.126 [2024-10-04 08:26:33.780954] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003296 00:08:41.126 [2024-10-04 08:26:33.781141] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (524892) > buf size (4096) 00:08:41.126 [2024-10-04 08:26:33.781556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a2a000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.126 [2024-10-04 08:26:33.781590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.126 [2024-10-04 08:26:33.781739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.126 [2024-10-04 08:26:33.781761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.126 [2024-10-04 08:26:33.781904] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00050296 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.126 [2024-10-04 08:26:33.781922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:41.127 [2024-10-04 08:26:33.782069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00960296 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.127 [2024-10-04 08:26:33.782088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:41.386 #33 NEW cov: 11899 ft: 14596 corp: 19/359b lim: 30 exec/s: 33 rss: 68Mb L: 27/27 MS: 1 ChangeByte- 00:08:41.386 [2024-10-04 08:26:33.840368] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:08:41.386 [2024-10-04 08:26:33.840783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.386 [2024-10-04 08:26:33.840813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.386 #34 NEW cov: 11899 ft: 14610 corp: 20/370b lim: 30 exec/s: 34 rss: 68Mb L: 11/27 MS: 1 CrossOver- 00:08:41.386 [2024-10-04 08:26:33.890855] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:08:41.386 [2024-10-04 08:26:33.891040] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (24576) > len (4) 00:08:41.386 [2024-10-04 08:26:33.891206] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200009632 00:08:41.386 [2024-10-04 08:26:33.891630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.386 [2024-10-04 08:26:33.891661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.386 [2024-10-04 08:26:33.891795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:000000f8 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.386 [2024-10-04 08:26:33.891815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.386 [2024-10-04 08:26:33.891951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000205 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.386 [2024-10-04 08:26:33.891970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:41.386 #35 NEW cov: 11899 ft: 14619 corp: 21/392b lim: 30 exec/s: 35 rss: 68Mb L: 22/27 MS: 1 InsertByte- 00:08:41.386 [2024-10-04 08:26:33.941058] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (11236) > buf size (4096) 00:08:41.386 [2024-10-04 08:26:33.941240] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:08:41.386 [2024-10-04 08:26:33.941573] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (584284) > buf size (4096) 00:08:41.386 [2024-10-04 08:26:33.941950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 
cid:4 nsid:0 cdw10:0af80000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.386 [2024-10-04 08:26:33.941979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.386 [2024-10-04 08:26:33.942116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0a000083 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.386 [2024-10-04 08:26:33.942132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.386 [2024-10-04 08:26:33.942280] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.386 [2024-10-04 08:26:33.942298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:41.386 [2024-10-04 08:26:33.942447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:3a960296 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.386 [2024-10-04 08:26:33.942464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:41.386 #41 NEW cov: 11899 ft: 14639 corp: 22/417b lim: 30 exec/s: 41 rss: 68Mb L: 25/27 MS: 1 PersAutoDict- DE: "\370\000\000\000"- 00:08:41.386 [2024-10-04 08:26:34.001233] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:08:41.386 [2024-10-04 08:26:34.001420] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (27756) > len (4) 00:08:41.386 [2024-10-04 08:26:34.001583] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (111028) > buf size (4096) 00:08:41.386 [2024-10-04 08:26:34.001941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.386 [2024-10-04 08:26:34.001975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.386 [2024-10-04 08:26:34.002114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.386 [2024-10-04 08:26:34.002134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.386 [2024-10-04 08:26:34.002284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:6c6c006c cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.386 [2024-10-04 08:26:34.002303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:41.387 #42 NEW cov: 11899 ft: 14659 corp: 23/439b lim: 30 exec/s: 42 rss: 68Mb L: 22/27 MS: 1 InsertRepeatedBytes- 00:08:41.387 [2024-10-04 08:26:34.061411] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:08:41.387 [2024-10-04 08:26:34.061745] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003196 00:08:41.387 [2024-10-04 08:26:34.062129] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 
SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.387 [2024-10-04 08:26:34.062161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.387 [2024-10-04 08:26:34.062301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.387 [2024-10-04 08:26:34.062319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.387 [2024-10-04 08:26:34.062456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000296 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.387 [2024-10-04 08:26:34.062474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:41.646 #43 NEW cov: 11899 ft: 14669 corp: 24/460b lim: 30 exec/s: 43 rss: 68Mb L: 21/27 MS: 1 ChangeASCIIInt- 00:08:41.646 [2024-10-04 08:26:34.121650] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:08:41.646 [2024-10-04 08:26:34.122007] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003296 00:08:41.646 [2024-10-04 08:26:34.122410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.646 [2024-10-04 08:26:34.122446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.646 [2024-10-04 08:26:34.122597] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000040 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.646 [2024-10-04 08:26:34.122616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.646 [2024-10-04 08:26:34.122761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000296 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.646 [2024-10-04 08:26:34.122779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:41.646 #44 NEW cov: 11899 ft: 14676 corp: 25/481b lim: 30 exec/s: 44 rss: 68Mb L: 21/27 MS: 1 ChangeBit- 00:08:41.646 [2024-10-04 08:26:34.171910] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:08:41.646 [2024-10-04 08:26:34.172088] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:41.646 [2024-10-04 08:26:34.172428] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (584276) > buf size (4096) 00:08:41.646 [2024-10-04 08:26:34.172804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.646 [2024-10-04 08:26:34.172838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.646 [2024-10-04 08:26:34.172978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.646 [2024-10-04 08:26:34.173000] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.646 [2024-10-04 08:26:34.173134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.646 [2024-10-04 08:26:34.173153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:41.646 [2024-10-04 08:26:34.173283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:3a940296 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.646 [2024-10-04 08:26:34.173303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:41.646 #45 NEW cov: 11899 ft: 14682 corp: 26/506b lim: 30 exec/s: 45 rss: 68Mb L: 25/27 MS: 1 InsertRepeatedBytes- 00:08:41.646 [2024-10-04 08:26:34.222077] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10412) > buf size (4096) 00:08:41.646 [2024-10-04 08:26:34.222283] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (65540) > buf size (4096) 00:08:41.646 [2024-10-04 08:26:34.222453] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003296 00:08:41.646 [2024-10-04 08:26:34.222622] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (524892) > buf size (4096) 00:08:41.646 [2024-10-04 08:26:34.223041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a2a000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.646 [2024-10-04 08:26:34.223071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.646 [2024-10-04 08:26:34.223192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:40000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.646 [2024-10-04 08:26:34.223211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.646 [2024-10-04 08:26:34.223346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00050296 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.646 [2024-10-04 08:26:34.223368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:41.646 [2024-10-04 08:26:34.223503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00960296 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.646 [2024-10-04 08:26:34.223522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:41.646 #46 NEW cov: 11899 ft: 14747 corp: 27/533b lim: 30 exec/s: 46 rss: 68Mb L: 27/27 MS: 1 ChangeBit- 00:08:41.646 [2024-10-04 08:26:34.282099] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:08:41.646 [2024-10-04 08:26:34.282276] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262148) > buf size (4096) 00:08:41.646 [2024-10-04 08:26:34.282665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:08:41.646 [2024-10-04 08:26:34.282695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.646 [2024-10-04 08:26:34.282832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00008100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.646 [2024-10-04 08:26:34.282849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.646 #47 NEW cov: 11899 ft: 14750 corp: 28/550b lim: 30 exec/s: 47 rss: 68Mb L: 17/27 MS: 1 ChangeBinInt- 00:08:41.905 [2024-10-04 08:26:34.332311] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:08:41.905 [2024-10-04 08:26:34.333040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.905 [2024-10-04 08:26:34.333071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.905 [2024-10-04 08:26:34.333215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:01000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.905 [2024-10-04 08:26:34.333236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.905 [2024-10-04 08:26:34.333388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.905 [2024-10-04 08:26:34.333406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:41.905 #48 NEW cov: 11899 ft: 14774 corp: 29/571b lim: 30 exec/s: 48 rss: 68Mb L: 21/27 MS: 1 CMP- DE: "\001\000\000\030"- 00:08:41.905 [2024-10-04 08:26:34.382315] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:08:41.905 [2024-10-04 08:26:34.382713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.905 [2024-10-04 08:26:34.382760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.905 #49 NEW cov: 11899 ft: 14781 corp: 30/582b lim: 30 exec/s: 49 rss: 69Mb L: 11/27 MS: 1 EraseBytes- 00:08:41.905 [2024-10-04 08:26:34.432810] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:08:41.905 [2024-10-04 08:26:34.433145] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003296 00:08:41.905 [2024-10-04 08:26:34.433329] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200008096 00:08:41.905 [2024-10-04 08:26:34.433728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.905 [2024-10-04 08:26:34.433765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.905 [2024-10-04 08:26:34.433901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 
nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.905 [2024-10-04 08:26:34.433920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.906 [2024-10-04 08:26:34.434059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00050296 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.906 [2024-10-04 08:26:34.434078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:41.906 [2024-10-04 08:26:34.434216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00960296 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.906 [2024-10-04 08:26:34.434238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:41.906 #50 NEW cov: 11899 ft: 14801 corp: 31/610b lim: 30 exec/s: 50 rss: 69Mb L: 28/28 MS: 1 InsertByte- 00:08:41.906 [2024-10-04 08:26:34.483039] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:08:41.906 [2024-10-04 08:26:34.483536] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200009632 00:08:41.906 [2024-10-04 08:26:34.483929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.906 [2024-10-04 08:26:34.483959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.906 [2024-10-04 08:26:34.484105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.906 [2024-10-04 08:26:34.484126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.906 [2024-10-04 08:26:34.484271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.906 [2024-10-04 08:26:34.484292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:41.906 [2024-10-04 08:26:34.484425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000205 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.906 [2024-10-04 08:26:34.484446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:41.906 #51 NEW cov: 11899 ft: 14875 corp: 32/638b lim: 30 exec/s: 51 rss: 69Mb L: 28/28 MS: 1 InsertRepeatedBytes- 00:08:41.906 [2024-10-04 08:26:34.533122] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:08:41.906 [2024-10-04 08:26:34.533482] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (524524) > buf size (4096) 00:08:41.906 [2024-10-04 08:26:34.533877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.906 [2024-10-04 08:26:34.533909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) 
qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.906 [2024-10-04 08:26:34.534042] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.906 [2024-10-04 08:26:34.534065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.906 [2024-10-04 08:26:34.534205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:003a0296 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.906 [2024-10-04 08:26:34.534228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:41.906 #52 NEW cov: 11899 ft: 14893 corp: 33/658b lim: 30 exec/s: 52 rss: 69Mb L: 20/28 MS: 1 ShuffleBytes- 00:08:41.906 [2024-10-04 08:26:34.583563] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:08:41.906 [2024-10-04 08:26:34.583902] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (584284) > buf size (4096) 00:08:41.906 [2024-10-04 08:26:34.584324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.906 [2024-10-04 08:26:34.584355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.906 [2024-10-04 08:26:34.584503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0a000083 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.906 [2024-10-04 08:26:34.584521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.906 [2024-10-04 08:26:34.584666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.906 [2024-10-04 08:26:34.584687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:41.906 [2024-10-04 08:26:34.584834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:3a960296 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.906 [2024-10-04 08:26:34.584853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:42.165 #53 NEW cov: 11899 ft: 14912 corp: 34/683b lim: 30 exec/s: 26 rss: 69Mb L: 25/28 MS: 1 PersAutoDict- DE: "\001\000\000\030"- 00:08:42.165 #53 DONE cov: 11899 ft: 14912 corp: 34/683b lim: 30 exec/s: 26 rss: 69Mb 00:08:42.165 ###### Recommended dictionary. ###### 00:08:42.165 "\370\000\000\000" # Uses: 1 00:08:42.165 "\001\000\000\030" # Uses: 1 00:08:42.165 ###### End of recommended dictionary. 
###### 00:08:42.165 Done 53 runs in 2 second(s) 00:08:42.165 08:26:34 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_1.conf 00:08:42.165 08:26:34 -- ../common.sh@72 -- # (( i++ )) 00:08:42.165 08:26:34 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:42.165 08:26:34 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:08:42.165 08:26:34 -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:08:42.165 08:26:34 -- nvmf/run.sh@24 -- # local timen=1 00:08:42.165 08:26:34 -- nvmf/run.sh@25 -- # local core=0x1 00:08:42.165 08:26:34 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:08:42.165 08:26:34 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:08:42.165 08:26:34 -- nvmf/run.sh@29 -- # printf %02d 2 00:08:42.165 08:26:34 -- nvmf/run.sh@29 -- # port=4402 00:08:42.165 08:26:34 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:08:42.166 08:26:34 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:08:42.166 08:26:34 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:42.166 08:26:34 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 -r /var/tmp/spdk2.sock 00:08:42.166 [2024-10-04 08:26:34.767415] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:08:42.166 [2024-10-04 08:26:34.767488] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1013008 ] 00:08:42.166 EAL: No free 2048 kB hugepages reported on node 1 00:08:42.425 [2024-10-04 08:26:34.943441] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:42.425 [2024-10-04 08:26:34.962333] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:42.425 [2024-10-04 08:26:34.962452] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:42.425 [2024-10-04 08:26:35.013711] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:42.425 [2024-10-04 08:26:35.030031] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:08:42.425 INFO: Running with entropic power schedule (0xFF, 100). 00:08:42.425 INFO: Seed: 3227575160 00:08:42.425 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:42.425 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:42.425 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:08:42.425 INFO: A corpus is not provided, starting from an empty corpus 00:08:42.425 #2 INITED exec/s: 0 rss: 59Mb 00:08:42.425 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
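The run.sh trace above is the per-instance setup that repeats before every run in this log: the instance number is zero-padded and appended to a 44xx base to pick a unique NVMe/TCP port, and a sed substitution rewrites the listener port in a per-instance copy of the shared JSON config before the target is launched. A minimal sketch of that pattern, condensed from the trace (the redirect target is our assumption, inferred from the script later passing /tmp/fuzz_json_2.conf to the fuzzer via -c):

  fuzzer_type=2
  port=44$(printf %02d "$fuzzer_type")      # "02" -> 4402 for instance 2
  nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf
  # rewrite the default listener port 4420 in a private copy of the template config
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" fuzz_json.conf > "$nvmf_cfg"

Each instance also gets its own RPC socket (-r /var/tmp/spdk2.sock above), so successive runs never collide on the port or socket of an earlier target.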
00:08:42.425 This may also happen if the target rejected all inputs we tried so far 00:08:42.425 [2024-10-04 08:26:35.085354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6f0000cc cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.425 [2024-10-04 08:26:35.085384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:42.684 NEW_FUNC[1/670]: 0x454738 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:08:42.684 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:42.684 #5 NEW cov: 11580 ft: 11579 corp: 2/10b lim: 35 exec/s: 0 rss: 67Mb L: 9/9 MS: 3 ChangeBit-ChangeByte-CMP- DE: "o\000\000\000\000\000\000\000"- 00:08:42.942 [2024-10-04 08:26:35.376014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6f0000cc cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.942 [2024-10-04 08:26:35.376053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:42.943 #6 NEW cov: 11697 ft: 12210 corp: 3/19b lim: 35 exec/s: 0 rss: 67Mb L: 9/9 MS: 1 ShuffleBytes- 00:08:42.943 [2024-10-04 08:26:35.426030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:680000cc cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.943 [2024-10-04 08:26:35.426057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:42.943 #7 NEW cov: 11703 ft: 12567 corp: 4/28b lim: 35 exec/s: 0 rss: 67Mb L: 9/9 MS: 1 ChangeBinInt- 00:08:42.943 [2024-10-04 08:26:35.466274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:680000cc cdw11:ff0000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.943 [2024-10-04 08:26:35.466300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:42.943 [2024-10-04 08:26:35.466358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:0000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.943 [2024-10-04 08:26:35.466373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:42.943 #8 NEW cov: 11788 ft: 13234 corp: 5/45b lim: 35 exec/s: 0 rss: 67Mb L: 17/17 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:08:42.943 [2024-10-04 08:26:35.506277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6f0000cc cdw11:00000008 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.943 [2024-10-04 08:26:35.506303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:42.943 #14 NEW cov: 11788 ft: 13387 corp: 6/54b lim: 35 exec/s: 0 rss: 67Mb L: 9/17 MS: 1 ChangeBit- 00:08:42.943 [2024-10-04 08:26:35.546380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6f0000cc cdw11:00000800 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.943 [2024-10-04 08:26:35.546406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD 
(00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:42.943 #15 NEW cov: 11788 ft: 13596 corp: 7/63b lim: 35 exec/s: 0 rss: 67Mb L: 9/17 MS: 1 ChangeBit- 00:08:42.943 [2024-10-04 08:26:35.586651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:680000cc cdw11:240000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.943 [2024-10-04 08:26:35.586677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:42.943 [2024-10-04 08:26:35.586732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:0000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.943 [2024-10-04 08:26:35.586746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:42.943 #16 NEW cov: 11788 ft: 13645 corp: 8/80b lim: 35 exec/s: 0 rss: 67Mb L: 17/17 MS: 1 ChangeByte- 00:08:43.201 [2024-10-04 08:26:35.627020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:68ff00cc cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.201 [2024-10-04 08:26:35.627046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:43.201 [2024-10-04 08:26:35.627102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.201 [2024-10-04 08:26:35.627117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:43.201 [2024-10-04 08:26:35.627170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.201 [2024-10-04 08:26:35.627184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:43.201 [2024-10-04 08:26:35.627241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.201 [2024-10-04 08:26:35.627256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:43.201 #17 NEW cov: 11788 ft: 14290 corp: 9/112b lim: 35 exec/s: 0 rss: 67Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:08:43.201 [2024-10-04 08:26:35.666874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.201 [2024-10-04 08:26:35.666900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:43.201 [2024-10-04 08:26:35.666956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.201 [2024-10-04 08:26:35.666970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:43.201 #22 NEW cov: 11788 ft: 14301 corp: 10/127b lim: 35 exec/s: 0 rss: 67Mb L: 15/32 MS: 5 CopyPart-ChangeByte-ShuffleBytes-ChangeByte-InsertRepeatedBytes- 00:08:43.201 [2024-10-04 08:26:35.707013] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:680000cc cdw11:ff0000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.201 [2024-10-04 08:26:35.707040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:43.201 [2024-10-04 08:26:35.707099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:0000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.201 [2024-10-04 08:26:35.707117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:43.201 #23 NEW cov: 11788 ft: 14390 corp: 11/144b lim: 35 exec/s: 0 rss: 67Mb L: 17/32 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:08:43.201 [2024-10-04 08:26:35.747002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6fdd00cc cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.201 [2024-10-04 08:26:35.747030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:43.201 #24 NEW cov: 11788 ft: 14401 corp: 12/153b lim: 35 exec/s: 0 rss: 67Mb L: 9/32 MS: 1 ChangeByte- 00:08:43.201 [2024-10-04 08:26:35.777081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6800002f cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.201 [2024-10-04 08:26:35.777107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:43.201 #25 NEW cov: 11788 ft: 14433 corp: 13/162b lim: 35 exec/s: 0 rss: 67Mb L: 9/32 MS: 1 ChangeByte- 00:08:43.202 [2024-10-04 08:26:35.817161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.202 [2024-10-04 08:26:35.817192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:43.202 #26 NEW cov: 11788 ft: 14508 corp: 14/172b lim: 35 exec/s: 0 rss: 67Mb L: 10/32 MS: 1 EraseBytes- 00:08:43.202 [2024-10-04 08:26:35.857696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:68ff00cc cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.202 [2024-10-04 08:26:35.857724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:43.202 [2024-10-04 08:26:35.857780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.202 [2024-10-04 08:26:35.857795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:43.202 [2024-10-04 08:26:35.857849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.202 [2024-10-04 08:26:35.857862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:43.202 [2024-10-04 08:26:35.857915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:08:43.202 [2024-10-04 08:26:35.857929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:43.460 #27 NEW cov: 11788 ft: 14550 corp: 15/204b lim: 35 exec/s: 0 rss: 68Mb L: 32/32 MS: 1 ChangeByte- 00:08:43.460 [2024-10-04 08:26:35.907611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.460 [2024-10-04 08:26:35.907638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:43.461 [2024-10-04 08:26:35.907694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.461 [2024-10-04 08:26:35.907708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:43.461 #28 NEW cov: 11788 ft: 14556 corp: 16/219b lim: 35 exec/s: 0 rss: 68Mb L: 15/32 MS: 1 ShuffleBytes- 00:08:43.461 [2024-10-04 08:26:35.947537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6f4000cc cdw11:00000008 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.461 [2024-10-04 08:26:35.947569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:43.461 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:43.461 #29 NEW cov: 11811 ft: 14577 corp: 17/228b lim: 35 exec/s: 0 rss: 68Mb L: 9/32 MS: 1 ChangeBit- 00:08:43.461 [2024-10-04 08:26:35.987788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:de7e00de cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.461 [2024-10-04 08:26:35.987814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:43.461 [2024-10-04 08:26:35.987870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.461 [2024-10-04 08:26:35.987884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:43.461 #30 NEW cov: 11811 ft: 14610 corp: 18/247b lim: 35 exec/s: 0 rss: 68Mb L: 19/32 MS: 1 CMP- DE: "~\000\000\000"- 00:08:43.461 [2024-10-04 08:26:36.027780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6800002f cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.461 [2024-10-04 08:26:36.027806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:43.461 #31 NEW cov: 11811 ft: 14634 corp: 19/254b lim: 35 exec/s: 0 rss: 68Mb L: 7/32 MS: 1 EraseBytes- 00:08:43.461 [2024-10-04 08:26:36.067873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:680000cc cdw11:ff0000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.461 [2024-10-04 08:26:36.067899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:43.461 #32 NEW cov: 11811 ft: 14647 corp: 20/265b lim: 35 exec/s: 32 rss: 68Mb L: 11/32 MS: 1 EraseBytes- 
00:08:43.461 [2024-10-04 08:26:36.108458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:434300cc cdw11:43004343 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.461 [2024-10-04 08:26:36.108485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:43.461 [2024-10-04 08:26:36.108541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:43430043 cdw11:43004343 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.461 [2024-10-04 08:26:36.108554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:43.461 [2024-10-04 08:26:36.108609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:43430043 cdw11:43004343 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.461 [2024-10-04 08:26:36.108623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:43.461 [2024-10-04 08:26:36.108680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:43430043 cdw11:dd00436f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.461 [2024-10-04 08:26:36.108694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:43.461 #33 NEW cov: 11811 ft: 14657 corp: 21/298b lim: 35 exec/s: 33 rss: 68Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:08:43.720 [2024-10-04 08:26:36.148319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.720 [2024-10-04 08:26:36.148346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:43.720 [2024-10-04 08:26:36.148402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00de cdw11:de00ff0d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.720 [2024-10-04 08:26:36.148419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:43.720 #34 NEW cov: 11811 ft: 14674 corp: 22/312b lim: 35 exec/s: 34 rss: 68Mb L: 14/33 MS: 1 CMP- DE: "\377\377\377\015"- 00:08:43.720 [2024-10-04 08:26:36.188664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:68ff00cc cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.720 [2024-10-04 08:26:36.188691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:43.720 [2024-10-04 08:26:36.188746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.720 [2024-10-04 08:26:36.188761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:43.720 [2024-10-04 08:26:36.188819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.720 [2024-10-04 08:26:36.188833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:08:43.720 [2024-10-04 08:26:36.188889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ff0700ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.720 [2024-10-04 08:26:36.188903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:43.720 #35 NEW cov: 11811 ft: 14684 corp: 23/344b lim: 35 exec/s: 35 rss: 68Mb L: 32/33 MS: 1 ChangeBinInt- 00:08:43.720 [2024-10-04 08:26:36.228436] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:43.720 [2024-10-04 08:26:36.228686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.720 [2024-10-04 08:26:36.228713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:43.720 [2024-10-04 08:26:36.228773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.720 [2024-10-04 08:26:36.228788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:43.720 [2024-10-04 08:26:36.228847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.720 [2024-10-04 08:26:36.228863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:43.720 #36 NEW cov: 11820 ft: 14867 corp: 24/367b lim: 35 exec/s: 36 rss: 68Mb L: 23/33 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:43.720 [2024-10-04 08:26:36.268506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6f0000cc cdw11:0000083d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.720 [2024-10-04 08:26:36.268532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:43.720 #37 NEW cov: 11820 ft: 14885 corp: 25/377b lim: 35 exec/s: 37 rss: 68Mb L: 10/33 MS: 1 InsertByte- 00:08:43.720 [2024-10-04 08:26:36.309029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:68ff00cc cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.720 [2024-10-04 08:26:36.309055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:43.720 [2024-10-04 08:26:36.309113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.720 [2024-10-04 08:26:36.309127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:43.720 [2024-10-04 08:26:36.309194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.720 [2024-10-04 08:26:36.309209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:43.720 [2024-10-04 08:26:36.309264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.720 [2024-10-04 08:26:36.309278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:43.720 #38 NEW cov: 11820 ft: 14897 corp: 26/409b lim: 35 exec/s: 38 rss: 68Mb L: 32/33 MS: 1 CrossOver- 00:08:43.720 [2024-10-04 08:26:36.348836] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:43.720 [2024-10-04 08:26:36.349158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:68ff00cc cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.720 [2024-10-04 08:26:36.349184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:43.720 [2024-10-04 08:26:36.349248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:00006f00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.720 [2024-10-04 08:26:36.349263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:43.720 [2024-10-04 08:26:36.349319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.720 [2024-10-04 08:26:36.349335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:43.720 [2024-10-04 08:26:36.349392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.720 [2024-10-04 08:26:36.349407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:43.720 #39 NEW cov: 11820 ft: 14944 corp: 27/441b lim: 35 exec/s: 39 rss: 68Mb L: 32/33 MS: 1 PersAutoDict- DE: "o\000\000\000\000\000\000\000"- 00:08:43.720 [2024-10-04 08:26:36.388836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6f4000cc cdw11:00000008 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.720 [2024-10-04 08:26:36.388862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:43.980 #45 NEW cov: 11820 ft: 14964 corp: 28/450b lim: 35 exec/s: 45 rss: 68Mb L: 9/33 MS: 1 ShuffleBytes- 00:08:43.980 [2024-10-04 08:26:36.429090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.980 [2024-10-04 08:26:36.429116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:43.980 [2024-10-04 08:26:36.429170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:dede00de cdw11:df00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.980 [2024-10-04 08:26:36.429184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:43.981 #46 NEW cov: 11820 ft: 14972 corp: 29/470b lim: 35 exec/s: 46 rss: 68Mb L: 20/33 MS: 1 InsertRepeatedBytes- 00:08:43.981 [2024-10-04 08:26:36.469234] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:dede00de cdw11:de007bde SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.981 [2024-10-04 08:26:36.469260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:43.981 [2024-10-04 08:26:36.469316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.981 [2024-10-04 08:26:36.469334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:43.981 #47 NEW cov: 11820 ft: 14990 corp: 30/485b lim: 35 exec/s: 47 rss: 69Mb L: 15/33 MS: 1 ChangeByte- 00:08:43.981 [2024-10-04 08:26:36.509271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:90220034 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.981 [2024-10-04 08:26:36.509297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:43.981 #48 NEW cov: 11820 ft: 15006 corp: 31/494b lim: 35 exec/s: 48 rss: 69Mb L: 9/33 MS: 1 ChangeBinInt- 00:08:43.981 [2024-10-04 08:26:36.549431] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:43.981 [2024-10-04 08:26:36.549766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:68ff00cc cdw11:6800ffcc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.981 [2024-10-04 08:26:36.549792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:43.981 [2024-10-04 08:26:36.549850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:6f00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.981 [2024-10-04 08:26:36.549864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:43.981 [2024-10-04 08:26:36.549919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.981 [2024-10-04 08:26:36.549935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:43.981 [2024-10-04 08:26:36.549989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.981 [2024-10-04 08:26:36.550003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:43.981 #49 NEW cov: 11820 ft: 15027 corp: 32/528b lim: 35 exec/s: 49 rss: 69Mb L: 34/34 MS: 1 CrossOver- 00:08:43.981 [2024-10-04 08:26:36.589603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:dede00de cdw11:de007b3d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.981 [2024-10-04 08:26:36.589628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:43.981 [2024-10-04 08:26:36.589680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:08:43.981 [2024-10-04 08:26:36.589693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:43.981 #50 NEW cov: 11820 ft: 15033 corp: 33/543b lim: 35 exec/s: 50 rss: 69Mb L: 15/34 MS: 1 ChangeByte- 00:08:43.981 [2024-10-04 08:26:36.629586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:680000cc cdw11:0000003a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.981 [2024-10-04 08:26:36.629612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:43.981 #51 NEW cov: 11820 ft: 15071 corp: 34/552b lim: 35 exec/s: 51 rss: 69Mb L: 9/34 MS: 1 ChangeByte- 00:08:44.241 [2024-10-04 08:26:36.669517] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:44.241 [2024-10-04 08:26:36.669862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:44.241 [2024-10-04 08:26:36.669890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:44.241 [2024-10-04 08:26:36.669949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:44.241 [2024-10-04 08:26:36.669963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:44.241 #52 NEW cov: 11820 ft: 15114 corp: 35/571b lim: 35 exec/s: 52 rss: 69Mb L: 19/34 MS: 1 CopyPart- 00:08:44.241 [2024-10-04 08:26:36.709845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:683a00cc cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:44.241 [2024-10-04 08:26:36.709870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:44.241 #53 NEW cov: 11820 ft: 15125 corp: 36/578b lim: 35 exec/s: 53 rss: 69Mb L: 7/34 MS: 1 EraseBytes- 00:08:44.241 [2024-10-04 08:26:36.750002] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:44.241 [2024-10-04 08:26:36.750233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:44.241 [2024-10-04 08:26:36.750259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:44.241 [2024-10-04 08:26:36.750317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:44.241 [2024-10-04 08:26:36.750333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:44.241 [2024-10-04 08:26:36.750390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:44.241 [2024-10-04 08:26:36.750405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:44.241 #54 NEW cov: 11820 ft: 15144 corp: 37/603b lim: 35 exec/s: 54 
rss: 69Mb L: 25/34 MS: 1 CopyPart- 00:08:44.241 [2024-10-04 08:26:36.790463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:434300cc cdw11:43004343 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:44.241 [2024-10-04 08:26:36.790489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:44.241 [2024-10-04 08:26:36.790543] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:43430043 cdw11:43004343 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:44.241 [2024-10-04 08:26:36.790557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:44.241 [2024-10-04 08:26:36.790611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:43430043 cdw11:43004343 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:44.241 [2024-10-04 08:26:36.790625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:44.241 [2024-10-04 08:26:36.790676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:43430043 cdw11:dd00436f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:44.241 [2024-10-04 08:26:36.790689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:44.241 #55 NEW cov: 11820 ft: 15156 corp: 38/636b lim: 35 exec/s: 55 rss: 69Mb L: 33/34 MS: 1 ShuffleBytes- 00:08:44.241 [2024-10-04 08:26:36.830197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6800002f cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:44.241 [2024-10-04 08:26:36.830223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:44.241 #56 NEW cov: 11820 ft: 15185 corp: 39/649b lim: 35 exec/s: 56 rss: 69Mb L: 13/34 MS: 1 CrossOver- 00:08:44.241 [2024-10-04 08:26:36.870420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:44.241 [2024-10-04 08:26:36.870447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:44.241 [2024-10-04 08:26:36.870501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:44.241 [2024-10-04 08:26:36.870515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:44.241 #57 NEW cov: 11820 ft: 15192 corp: 40/664b lim: 35 exec/s: 57 rss: 69Mb L: 15/34 MS: 1 ShuffleBytes- 00:08:44.241 [2024-10-04 08:26:36.900374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6f0000cc cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:44.241 [2024-10-04 08:26:36.900399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:44.241 #58 NEW cov: 11820 ft: 15205 corp: 41/672b lim: 35 exec/s: 58 rss: 69Mb L: 8/34 MS: 1 EraseBytes- 00:08:44.501 [2024-10-04 08:26:36.940480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 
nsid:0 cdw10:6f0000cb cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:44.501 [2024-10-04 08:26:36.940506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:44.501 #59 NEW cov: 11820 ft: 15209 corp: 42/681b lim: 35 exec/s: 59 rss: 69Mb L: 9/34 MS: 1 ChangeBinInt- 00:08:44.501 [2024-10-04 08:26:36.980778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:44.501 [2024-10-04 08:26:36.980804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:44.501 [2024-10-04 08:26:36.980858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:deff00de cdw11:0d00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:44.501 [2024-10-04 08:26:36.980871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:44.501 #60 NEW cov: 11820 ft: 15249 corp: 43/696b lim: 35 exec/s: 60 rss: 69Mb L: 15/34 MS: 1 PersAutoDict- DE: "\377\377\377\015"- 00:08:44.501 [2024-10-04 08:26:37.020590] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:44.501 [2024-10-04 08:26:37.020817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:cb000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:44.501 [2024-10-04 08:26:37.020844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:44.501 #61 NEW cov: 11820 ft: 15271 corp: 44/705b lim: 35 exec/s: 61 rss: 69Mb L: 9/34 MS: 1 ShuffleBytes- 00:08:44.501 [2024-10-04 08:26:37.060991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:680000cc cdw11:ff0000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:44.501 [2024-10-04 08:26:37.061017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:44.501 [2024-10-04 08:26:37.061074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:21ff00ff cdw11:0000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:44.501 [2024-10-04 08:26:37.061088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:44.501 #62 NEW cov: 11820 ft: 15277 corp: 45/722b lim: 35 exec/s: 31 rss: 69Mb L: 17/34 MS: 1 ChangeByte- 00:08:44.501 #62 DONE cov: 11820 ft: 15277 corp: 45/722b lim: 35 exec/s: 31 rss: 69Mb 00:08:44.501 ###### Recommended dictionary. ###### 00:08:44.501 "o\000\000\000\000\000\000\000" # Uses: 1 00:08:44.501 "\377\377\377\377\377\377\377\377" # Uses: 1 00:08:44.501 "~\000\000\000" # Uses: 0 00:08:44.501 "\377\377\377\015" # Uses: 1 00:08:44.501 "\000\000\000\000\000\000\000\000" # Uses: 0 00:08:44.501 ###### End of recommended dictionary. 
###### 00:08:44.501 Done 62 runs in 2 second(s) 00:08:44.760 08:26:37 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_2.conf 00:08:44.760 08:26:37 -- ../common.sh@72 -- # (( i++ )) 00:08:44.760 08:26:37 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:44.760 08:26:37 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:08:44.760 08:26:37 -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:08:44.760 08:26:37 -- nvmf/run.sh@24 -- # local timen=1 00:08:44.760 08:26:37 -- nvmf/run.sh@25 -- # local core=0x1 00:08:44.760 08:26:37 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:08:44.760 08:26:37 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:08:44.760 08:26:37 -- nvmf/run.sh@29 -- # printf %02d 3 00:08:44.760 08:26:37 -- nvmf/run.sh@29 -- # port=4403 00:08:44.760 08:26:37 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:08:44.760 08:26:37 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:08:44.760 08:26:37 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:44.760 08:26:37 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 -r /var/tmp/spdk3.sock 00:08:44.760 [2024-10-04 08:26:37.231135] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:08:44.760 [2024-10-04 08:26:37.231227] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1013308 ] 00:08:44.760 EAL: No free 2048 kB hugepages reported on node 1 00:08:44.760 [2024-10-04 08:26:37.405512] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:44.760 [2024-10-04 08:26:37.424767] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:44.760 [2024-10-04 08:26:37.424891] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:45.019 [2024-10-04 08:26:37.476507] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:45.019 [2024-10-04 08:26:37.492867] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:08:45.019 INFO: Running with entropic power schedule (0xFF, 100). 00:08:45.019 INFO: Seed: 1394631282 00:08:45.019 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:45.019 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:45.019 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:08:45.019 INFO: A corpus is not provided, starting from an empty corpus 00:08:45.019 #2 INITED exec/s: 0 rss: 60Mb 00:08:45.019 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
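Every "#NN NEW" record in these runs is a standard libFuzzer status line; the wrapper only adds the Jenkins timestamp prefix. Taking one sample from run 2 above, the fields read as follows (meanings per the libFuzzer documentation; the gloss is not part of the log):

  # #38 NEW cov: 11820 ft: 14897 corp: 26/409b lim: 35 exec/s: 38 rss: 68Mb L: 32/32 MS: 1 CrossOver-
  #   cov:    total coverage edges/blocks observed so far
  #   ft:     coverage "features", a finer-grained signal than edges
  #   corp:   corpus entries / total corpus size in bytes
  #   lim:    current cap on the length of newly generated inputs
  #   exec/s: executions per second
  #   rss:    resident memory of the fuzzer process
  #   L:      length of this new input / largest input in the corpus
  #   MS:     count and names of the mutations that produced it

"#NN DONE" closes a run with the final totals, and the "Recommended dictionary" block that follows it lists byte sequences libFuzzer found productive (here, NVMe command fields) and suggests seeding into later runs.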
00:08:45.019 This may also happen if the target rejected all inputs we tried so far 00:08:45.278 NEW_FUNC[1/659]: 0x456418 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:08:45.278 NEW_FUNC[2/659]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:45.278 #5 NEW cov: 11504 ft: 11503 corp: 2/19b lim: 20 exec/s: 0 rss: 67Mb L: 18/18 MS: 3 InsertByte-ShuffleBytes-InsertRepeatedBytes- 00:08:45.278 #6 NEW cov: 11619 ft: 11998 corp: 3/37b lim: 20 exec/s: 0 rss: 67Mb L: 18/18 MS: 1 ChangeByte- 00:08:45.278 #7 NEW cov: 11625 ft: 12272 corp: 4/56b lim: 20 exec/s: 0 rss: 67Mb L: 19/19 MS: 1 InsertByte- 00:08:45.538 #8 NEW cov: 11710 ft: 12511 corp: 5/75b lim: 20 exec/s: 0 rss: 67Mb L: 19/19 MS: 1 CrossOver- 00:08:45.538 #9 NEW cov: 11710 ft: 12608 corp: 6/93b lim: 20 exec/s: 0 rss: 67Mb L: 18/19 MS: 1 ChangeBit- 00:08:45.538 #10 NEW cov: 11710 ft: 12789 corp: 7/112b lim: 20 exec/s: 0 rss: 67Mb L: 19/19 MS: 1 ChangeByte- 00:08:45.538 #11 NEW cov: 11710 ft: 12915 corp: 8/129b lim: 20 exec/s: 0 rss: 67Mb L: 17/19 MS: 1 EraseBytes- 00:08:45.538 #13 NEW cov: 11710 ft: 13035 corp: 9/145b lim: 20 exec/s: 0 rss: 67Mb L: 16/19 MS: 2 InsertByte-CrossOver- 00:08:45.538 #15 NEW cov: 11715 ft: 13424 corp: 10/154b lim: 20 exec/s: 0 rss: 67Mb L: 9/19 MS: 2 ChangeBit-CMP- DE: "\000h\340v\263gE\346"- 00:08:45.797 #16 NEW cov: 11715 ft: 13455 corp: 11/173b lim: 20 exec/s: 0 rss: 68Mb L: 19/19 MS: 1 ChangeByte- 00:08:45.797 #17 NEW cov: 11715 ft: 13467 corp: 12/192b lim: 20 exec/s: 0 rss: 68Mb L: 19/19 MS: 1 ChangeBit- 00:08:45.797 #18 NEW cov: 11715 ft: 13574 corp: 13/210b lim: 20 exec/s: 0 rss: 68Mb L: 18/19 MS: 1 ChangeBinInt- 00:08:45.797 #19 NEW cov: 11715 ft: 13667 corp: 14/220b lim: 20 exec/s: 0 rss: 68Mb L: 10/19 MS: 1 CrossOver- 00:08:45.797 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:45.797 #20 NEW cov: 11738 ft: 13789 corp: 15/240b lim: 20 exec/s: 0 rss: 68Mb L: 20/20 MS: 1 CrossOver- 00:08:46.056 #21 NEW cov: 11738 ft: 13810 corp: 16/259b lim: 20 exec/s: 0 rss: 68Mb L: 19/20 MS: 1 ChangeBinInt- 00:08:46.057 #22 NEW cov: 11738 ft: 13859 corp: 17/278b lim: 20 exec/s: 22 rss: 68Mb L: 19/20 MS: 1 ShuffleBytes- 00:08:46.057 #23 NEW cov: 11738 ft: 13868 corp: 18/297b lim: 20 exec/s: 23 rss: 68Mb L: 19/20 MS: 1 InsertByte- 00:08:46.057 #24 NEW cov: 11738 ft: 13911 corp: 19/316b lim: 20 exec/s: 24 rss: 68Mb L: 19/20 MS: 1 ChangeBit- 00:08:46.057 #25 NEW cov: 11738 ft: 13918 corp: 20/335b lim: 20 exec/s: 25 rss: 68Mb L: 19/20 MS: 1 CopyPart- 00:08:46.057 #26 NEW cov: 11738 ft: 13972 corp: 21/355b lim: 20 exec/s: 26 rss: 68Mb L: 20/20 MS: 1 ChangeByte- 00:08:46.315 #27 NEW cov: 11738 ft: 13986 corp: 22/374b lim: 20 exec/s: 27 rss: 68Mb L: 19/20 MS: 1 ChangeBit- 00:08:46.315 #28 NEW cov: 11738 ft: 14014 corp: 23/384b lim: 20 exec/s: 28 rss: 68Mb L: 10/20 MS: 1 ChangeByte- 00:08:46.315 #29 NEW cov: 11738 ft: 14047 corp: 24/403b lim: 20 exec/s: 29 rss: 68Mb L: 19/20 MS: 1 ChangeByte- 00:08:46.315 #30 NEW cov: 11738 ft: 14069 corp: 25/422b lim: 20 exec/s: 30 rss: 68Mb L: 19/20 MS: 1 ChangeBinInt- 00:08:46.315 #31 NEW cov: 11738 ft: 14092 corp: 26/440b lim: 20 exec/s: 31 rss: 68Mb L: 18/20 MS: 1 ShuffleBytes- 00:08:46.574 #32 NEW cov: 11738 ft: 14129 corp: 27/458b lim: 20 exec/s: 32 rss: 68Mb L: 18/20 MS: 1 CrossOver- 00:08:46.574 #33 NEW cov: 11738 ft: 14151 corp: 28/474b lim: 20 
exec/s: 33 rss: 69Mb L: 16/20 MS: 1 ChangeBit- 00:08:46.574 #34 NEW cov: 11738 ft: 14157 corp: 29/494b lim: 20 exec/s: 34 rss: 69Mb L: 20/20 MS: 1 PersAutoDict- DE: "\000h\340v\263gE\346"- 00:08:46.574 #35 NEW cov: 11738 ft: 14173 corp: 30/514b lim: 20 exec/s: 35 rss: 69Mb L: 20/20 MS: 1 PersAutoDict- DE: "\000h\340v\263gE\346"- 00:08:46.574 #36 NEW cov: 11738 ft: 14201 corp: 31/533b lim: 20 exec/s: 36 rss: 69Mb L: 19/20 MS: 1 ChangeByte- 00:08:46.574 #37 NEW cov: 11738 ft: 14218 corp: 32/543b lim: 20 exec/s: 37 rss: 69Mb L: 10/20 MS: 1 CrossOver- 00:08:46.833 #38 NEW cov: 11738 ft: 14230 corp: 33/562b lim: 20 exec/s: 38 rss: 69Mb L: 19/20 MS: 1 ChangeBit- 00:08:46.833 #39 NEW cov: 11738 ft: 14245 corp: 34/579b lim: 20 exec/s: 39 rss: 69Mb L: 17/20 MS: 1 EraseBytes- 00:08:46.833 #40 NEW cov: 11738 ft: 14249 corp: 35/598b lim: 20 exec/s: 40 rss: 69Mb L: 19/20 MS: 1 InsertByte- 00:08:46.833 #41 NEW cov: 11738 ft: 14320 corp: 36/608b lim: 20 exec/s: 41 rss: 69Mb L: 10/20 MS: 1 EraseBytes- 00:08:46.833 #42 NEW cov: 11738 ft: 14365 corp: 37/628b lim: 20 exec/s: 42 rss: 69Mb L: 20/20 MS: 1 ChangeBit- 00:08:46.833 #43 NEW cov: 11738 ft: 14379 corp: 38/647b lim: 20 exec/s: 43 rss: 69Mb L: 19/20 MS: 1 ChangeBinInt- 00:08:47.092 #44 NEW cov: 11738 ft: 14401 corp: 39/657b lim: 20 exec/s: 22 rss: 69Mb L: 10/20 MS: 1 EraseBytes- 00:08:47.092 #44 DONE cov: 11738 ft: 14401 corp: 39/657b lim: 20 exec/s: 22 rss: 69Mb 00:08:47.092 ###### Recommended dictionary. ###### 00:08:47.092 "\000h\340v\263gE\346" # Uses: 2 00:08:47.092 ###### End of recommended dictionary. ###### 00:08:47.092 Done 44 runs in 2 second(s) 00:08:47.092 08:26:39 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_3.conf 00:08:47.092 08:26:39 -- ../common.sh@72 -- # (( i++ )) 00:08:47.092 08:26:39 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:47.092 08:26:39 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:08:47.092 08:26:39 -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:08:47.092 08:26:39 -- nvmf/run.sh@24 -- # local timen=1 00:08:47.092 08:26:39 -- nvmf/run.sh@25 -- # local core=0x1 00:08:47.092 08:26:39 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:08:47.092 08:26:39 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:08:47.092 08:26:39 -- nvmf/run.sh@29 -- # printf %02d 4 00:08:47.092 08:26:39 -- nvmf/run.sh@29 -- # port=4404 00:08:47.092 08:26:39 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:08:47.092 08:26:39 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:08:47.092 08:26:39 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:47.092 08:26:39 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 -r /var/tmp/spdk4.sock 00:08:47.092 [2024-10-04 08:26:39.697342] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
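The long llvm_nvme_fuzz invocation above is assembled from the local variables traced just before it. Mapping the flags back to those variables (our reading of the trace plus common SPDK application options; the units on -s and -t are assumptions, not confirmed by the log):

  # -m 0x1                             CPU core mask                          (core=0x1)
  # -s 512                             DPDK memory size, MB
  # -P .../output/llvm/                where crash artifacts are written
  # -F 'trtype:tcp ... trsvcid:4404'   transport ID of the target under test  (trid)
  # -c /tmp/fuzz_json_4.conf           per-instance config from the sed step  (nvmf_cfg)
  # -t 1                               run duration                           (timen=1)
  # -D .../llvm_nvmf_4                 corpus directory, freshly mkdir'd      (corpus_dir)
  # -Z 4                               which fuzzer entry point to exercise   (fuzzer_type)
  # -r /var/tmp/spdk4.sock             this instance's SPDK RPC socket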
00:08:47.092 [2024-10-04 08:26:39.697435] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1013840 ] 00:08:47.092 EAL: No free 2048 kB hugepages reported on node 1 00:08:47.351 [2024-10-04 08:26:39.870877] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:47.351 [2024-10-04 08:26:39.889848] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:47.351 [2024-10-04 08:26:39.889968] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:47.351 [2024-10-04 08:26:39.941625] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:47.351 [2024-10-04 08:26:39.957982] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:08:47.351 INFO: Running with entropic power schedule (0xFF, 100). 00:08:47.351 INFO: Seed: 3859615573 00:08:47.351 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:47.351 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:47.351 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:08:47.351 INFO: A corpus is not provided, starting from an empty corpus 00:08:47.351 #2 INITED exec/s: 0 rss: 59Mb 00:08:47.351 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:47.351 This may also happen if the target rejected all inputs we tried so far 00:08:47.351 [2024-10-04 08:26:40.003600] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:caca32ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.351 [2024-10-04 08:26:40.003632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.351 [2024-10-04 08:26:40.003685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.351 [2024-10-04 08:26:40.003700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:47.351 [2024-10-04 08:26:40.003753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.351 [2024-10-04 08:26:40.003767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:47.351 [2024-10-04 08:26:40.003819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.351 [2024-10-04 08:26:40.003832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:47.873 NEW_FUNC[1/671]: 0x457518 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:08:47.873 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:47.873 #10 NEW cov: 11605 ft: 11606 corp: 2/33b lim: 35 exec/s: 0 rss: 67Mb L: 
32/32 MS: 3 ChangeBit-ChangeByte-InsertRepeatedBytes- 00:08:47.873 [2024-10-04 08:26:40.324311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:caca32ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.873 [2024-10-04 08:26:40.324352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.873 [2024-10-04 08:26:40.324413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.873 [2024-10-04 08:26:40.324430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:47.873 [2024-10-04 08:26:40.324488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.873 [2024-10-04 08:26:40.324505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:47.873 #11 NEW cov: 11718 ft: 12409 corp: 3/60b lim: 35 exec/s: 0 rss: 67Mb L: 27/32 MS: 1 EraseBytes- 00:08:47.873 [2024-10-04 08:26:40.374341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:caca32ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.873 [2024-10-04 08:26:40.374369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.873 [2024-10-04 08:26:40.374424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cac4caca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.873 [2024-10-04 08:26:40.374438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:47.873 [2024-10-04 08:26:40.374492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.874 [2024-10-04 08:26:40.374506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:47.874 #17 NEW cov: 11724 ft: 12630 corp: 4/87b lim: 35 exec/s: 0 rss: 67Mb L: 27/32 MS: 1 ChangeBinInt- 00:08:47.874 [2024-10-04 08:26:40.414577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:caca32ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.874 [2024-10-04 08:26:40.414604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.874 [2024-10-04 08:26:40.414658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.874 [2024-10-04 08:26:40.414673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:47.874 [2024-10-04 08:26:40.414724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.874 [2024-10-04 08:26:40.414738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:47.874 [2024-10-04 08:26:40.414791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.874 [2024-10-04 08:26:40.414804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:47.874 #18 NEW cov: 11809 ft: 12928 corp: 5/117b lim: 35 exec/s: 0 rss: 67Mb L: 30/32 MS: 1 CopyPart- 00:08:47.874 [2024-10-04 08:26:40.454726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:caca32ca cdw11:24ca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.874 [2024-10-04 08:26:40.454753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.874 [2024-10-04 08:26:40.454806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cacacaca cdw11:c4ca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.874 [2024-10-04 08:26:40.454820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:47.874 [2024-10-04 08:26:40.454873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.874 [2024-10-04 08:26:40.454888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:47.874 [2024-10-04 08:26:40.454939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.874 [2024-10-04 08:26:40.454952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:47.874 #19 NEW cov: 11809 ft: 13121 corp: 6/145b lim: 35 exec/s: 0 rss: 67Mb L: 28/32 MS: 1 InsertByte- 00:08:47.874 [2024-10-04 08:26:40.494680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:caca32ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.874 [2024-10-04 08:26:40.494707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.874 [2024-10-04 08:26:40.494764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.874 [2024-10-04 08:26:40.494777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:47.874 [2024-10-04 08:26:40.494831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.874 [2024-10-04 08:26:40.494846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:47.874 #20 NEW cov: 11809 ft: 13213 corp: 7/167b lim: 35 exec/s: 0 rss: 67Mb L: 22/32 MS: 1 EraseBytes- 00:08:47.874 [2024-10-04 08:26:40.534964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:caca32ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:47.874 [2024-10-04 08:26:40.534990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.874 [2024-10-04 08:26:40.535043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.874 [2024-10-04 08:26:40.535056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:47.874 [2024-10-04 08:26:40.535107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.874 [2024-10-04 08:26:40.535121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:47.874 [2024-10-04 08:26:40.535173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:cacac2ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.874 [2024-10-04 08:26:40.535190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:48.184 #21 NEW cov: 11809 ft: 13259 corp: 8/199b lim: 35 exec/s: 0 rss: 67Mb L: 32/32 MS: 1 ChangeBit- 00:08:48.184 [2024-10-04 08:26:40.575020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:caca32ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.184 [2024-10-04 08:26:40.575053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.184 [2024-10-04 08:26:40.575122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cacacaca cdw11:e6ca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.184 [2024-10-04 08:26:40.575142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.184 [2024-10-04 08:26:40.575214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.184 [2024-10-04 08:26:40.575232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.184 [2024-10-04 08:26:40.575296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:cacacac2 cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.184 [2024-10-04 08:26:40.575317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:48.184 #22 NEW cov: 11809 ft: 13518 corp: 9/232b lim: 35 exec/s: 0 rss: 68Mb L: 33/33 MS: 1 InsertByte- 00:08:48.184 [2024-10-04 08:26:40.625011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:caca32ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.184 [2024-10-04 08:26:40.625038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.185 [2024-10-04 08:26:40.625095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cac4caca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
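Each numbered test case in this run appears as a burst of nvme_admin_qpair_print_command / spdk_nvme_print_completion pairs like those above: the opcode stays pinned at CREATE IO CQ (05), the cdw10/cdw11 dwords carry the raw fuzz input (the 0xca filler with individual bytes and bits altered by the ChangeByte/ChangeBit mutations), and the target fails every command with INVALID OPCODE, as expected for a fabrics target that creates queues through Connect rather than the PCIe-style queue-creation opcodes. A hedged sketch of that packing step, using a simplified stand-in struct rather than SPDK's real struct spdk_nvme_cmd:

/*
 * Illustrative sketch: spraying fuzz input over the dword payload of an
 * admin command, producing cdw10/cdw11 values like the 0xcaca... runs in
 * the log. The struct below is a toy stand-in, not SPDK's definition.
 */
#include <stddef.h>
#include <stdint.h>
#include <string.h>

struct toy_nvme_cmd {
    uint8_t  opc;      /* 0x05 = CREATE IO CQ, as printed above */
    uint32_t nsid;
    uint32_t cdw10;
    uint32_t cdw11;
};

static void build_create_cq_cmd(struct toy_nvme_cmd *cmd,
                                const uint8_t *data, size_t len)
{
    memset(cmd, 0, sizeof(*cmd));
    cmd->opc = 0x05;                       /* fixed by the fuzzer type */
    /* Copy input bytes into the dwords; short inputs leave zeroes behind. */
    if (len >= 4)
        memcpy(&cmd->cdw10, data, 4);
    if (len >= 8)
        memcpy(&cmd->cdw11, data + 4, 4);
}

Submitting a command built this way over the admin qpair is what triggers each print_command/print_completion pair seen throughout this run.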
00:08:48.185 [2024-10-04 08:26:40.625109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.185 [2024-10-04 08:26:40.625165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:cacacaca cdw11:8bca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.185 [2024-10-04 08:26:40.625179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.185 #23 NEW cov: 11809 ft: 13555 corp: 10/259b lim: 35 exec/s: 0 rss: 68Mb L: 27/33 MS: 1 ChangeByte- 00:08:48.185 [2024-10-04 08:26:40.665332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ca2432ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.185 [2024-10-04 08:26:40.665358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.185 [2024-10-04 08:26:40.665411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cacacaca cdw11:c4ca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.185 [2024-10-04 08:26:40.665426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.185 [2024-10-04 08:26:40.665479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.185 [2024-10-04 08:26:40.665493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.185 [2024-10-04 08:26:40.665544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.185 [2024-10-04 08:26:40.665556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:48.185 #24 NEW cov: 11809 ft: 13590 corp: 11/287b lim: 35 exec/s: 0 rss: 68Mb L: 28/33 MS: 1 ShuffleBytes- 00:08:48.185 [2024-10-04 08:26:40.705141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:caca0aca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.185 [2024-10-04 08:26:40.705167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.185 [2024-10-04 08:26:40.705228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cacaca8b cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.185 [2024-10-04 08:26:40.705243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.185 #27 NEW cov: 11809 ft: 13890 corp: 12/304b lim: 35 exec/s: 0 rss: 68Mb L: 17/33 MS: 3 ShuffleBytes-ShuffleBytes-CrossOver- 00:08:48.185 [2024-10-04 08:26:40.745580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:caca32ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.185 [2024-10-04 08:26:40.745607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.185 [2024-10-04 08:26:40.745663] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.185 [2024-10-04 08:26:40.745676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.185 [2024-10-04 08:26:40.745729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.185 [2024-10-04 08:26:40.745743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.185 [2024-10-04 08:26:40.745795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.185 [2024-10-04 08:26:40.745808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:48.185 #28 NEW cov: 11809 ft: 13947 corp: 13/336b lim: 35 exec/s: 0 rss: 68Mb L: 32/33 MS: 1 CopyPart- 00:08:48.185 [2024-10-04 08:26:40.785733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:caca32ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.185 [2024-10-04 08:26:40.785760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.185 [2024-10-04 08:26:40.785814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.185 [2024-10-04 08:26:40.785827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.185 [2024-10-04 08:26:40.785879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.185 [2024-10-04 08:26:40.785892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.185 [2024-10-04 08:26:40.785942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:cacac2ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.185 [2024-10-04 08:26:40.785955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:48.185 #29 NEW cov: 11809 ft: 14031 corp: 14/368b lim: 35 exec/s: 0 rss: 68Mb L: 32/33 MS: 1 ShuffleBytes- 00:08:48.185 [2024-10-04 08:26:40.825673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:caca32ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.185 [2024-10-04 08:26:40.825700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.185 [2024-10-04 08:26:40.825762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.185 [2024-10-04 08:26:40.825775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.185 [2024-10-04 08:26:40.825829] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.185 [2024-10-04 08:26:40.825843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.185 #30 NEW cov: 11809 ft: 14068 corp: 15/389b lim: 35 exec/s: 0 rss: 68Mb L: 21/33 MS: 1 EraseBytes- 00:08:48.445 [2024-10-04 08:26:40.875865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:caca32ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.445 [2024-10-04 08:26:40.875893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.445 [2024-10-04 08:26:40.875948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.446 [2024-10-04 08:26:40.875962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.446 [2024-10-04 08:26:40.876013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:fff5feff cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.446 [2024-10-04 08:26:40.876027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.446 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:48.446 #31 NEW cov: 11832 ft: 14081 corp: 16/415b lim: 35 exec/s: 0 rss: 68Mb L: 26/33 MS: 1 CMP- DE: "\376\377\377\365"- 00:08:48.446 [2024-10-04 08:26:40.925994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:caca32ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.446 [2024-10-04 08:26:40.926021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.446 [2024-10-04 08:26:40.926075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cacaca8a cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.446 [2024-10-04 08:26:40.926089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.446 [2024-10-04 08:26:40.926141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.446 [2024-10-04 08:26:40.926155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.446 #32 NEW cov: 11832 ft: 14118 corp: 17/442b lim: 35 exec/s: 0 rss: 68Mb L: 27/33 MS: 1 ChangeBit- 00:08:48.446 [2024-10-04 08:26:40.966246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:caca32ca cdw11:24ca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.446 [2024-10-04 08:26:40.966272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.446 [2024-10-04 08:26:40.966327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:59595959 cdw11:caca0003 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.446 [2024-10-04 08:26:40.966340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.446 [2024-10-04 08:26:40.966394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:cacacac4 cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.446 [2024-10-04 08:26:40.966410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.446 [2024-10-04 08:26:40.966463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.446 [2024-10-04 08:26:40.966476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:48.446 #33 NEW cov: 11832 ft: 14150 corp: 18/474b lim: 35 exec/s: 0 rss: 68Mb L: 32/33 MS: 1 InsertRepeatedBytes- 00:08:48.446 [2024-10-04 08:26:41.006341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:caca32ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.446 [2024-10-04 08:26:41.006367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.446 [2024-10-04 08:26:41.006424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cac4caca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.446 [2024-10-04 08:26:41.006439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.446 [2024-10-04 08:26:41.006491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.446 [2024-10-04 08:26:41.006504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.446 [2024-10-04 08:26:41.006552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:8bcacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.446 [2024-10-04 08:26:41.006565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:48.446 #34 NEW cov: 11832 ft: 14166 corp: 19/506b lim: 35 exec/s: 34 rss: 68Mb L: 32/33 MS: 1 CrossOver- 00:08:48.446 [2024-10-04 08:26:41.046351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:caca32ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.446 [2024-10-04 08:26:41.046377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.446 [2024-10-04 08:26:41.046432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cacabb8a cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.446 [2024-10-04 08:26:41.046446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.446 [2024-10-04 08:26:41.046497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:cacacaca 
cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.446 [2024-10-04 08:26:41.046510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.446 #35 NEW cov: 11832 ft: 14207 corp: 20/533b lim: 35 exec/s: 35 rss: 68Mb L: 27/33 MS: 1 ChangeByte- 00:08:48.446 [2024-10-04 08:26:41.096626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:caca32ca cdw11:24ca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.446 [2024-10-04 08:26:41.096652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.446 [2024-10-04 08:26:41.096709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0c005959 cdw11:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.446 [2024-10-04 08:26:41.096723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.446 [2024-10-04 08:26:41.096775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:cacacac4 cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.446 [2024-10-04 08:26:41.096791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.446 [2024-10-04 08:26:41.096842] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.446 [2024-10-04 08:26:41.096855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:48.446 #36 NEW cov: 11832 ft: 14219 corp: 21/565b lim: 35 exec/s: 36 rss: 68Mb L: 32/33 MS: 1 CMP- DE: "\014\000\000\000"- 00:08:48.706 [2024-10-04 08:26:41.136730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ca2432ca cdw11:ca3f0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.706 [2024-10-04 08:26:41.136755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.706 [2024-10-04 08:26:41.136807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cacacaca cdw11:cac40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.706 [2024-10-04 08:26:41.136820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.706 [2024-10-04 08:26:41.136873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.706 [2024-10-04 08:26:41.136886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.706 [2024-10-04 08:26:41.136935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.706 [2024-10-04 08:26:41.136948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:48.706 #37 NEW cov: 11832 ft: 14232 corp: 22/594b lim: 35 exec/s: 37 rss: 68Mb L: 29/33 MS: 1 InsertByte- 00:08:48.706 [2024-10-04 
08:26:41.176863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ca2432ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.706 [2024-10-04 08:26:41.176888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.706 [2024-10-04 08:26:41.176941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cacacaca cdw11:c4ca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.706 [2024-10-04 08:26:41.176955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.706 [2024-10-04 08:26:41.177007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:cac8caca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.706 [2024-10-04 08:26:41.177020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.706 [2024-10-04 08:26:41.177073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.706 [2024-10-04 08:26:41.177086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:48.706 #38 NEW cov: 11832 ft: 14237 corp: 23/622b lim: 35 exec/s: 38 rss: 68Mb L: 28/33 MS: 1 ChangeBit- 00:08:48.706 [2024-10-04 08:26:41.216956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:caca32ca cdw11:24ca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.706 [2024-10-04 08:26:41.216981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.706 [2024-10-04 08:26:41.217035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:59595959 cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.706 [2024-10-04 08:26:41.217052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.706 [2024-10-04 08:26:41.217101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:cacacac4 cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.706 [2024-10-04 08:26:41.217115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.706 [2024-10-04 08:26:41.217166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.706 [2024-10-04 08:26:41.217179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:48.706 #39 NEW cov: 11832 ft: 14277 corp: 24/654b lim: 35 exec/s: 39 rss: 68Mb L: 32/33 MS: 1 CrossOver- 00:08:48.706 [2024-10-04 08:26:41.257095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:caca32ca cdw11:24ca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.706 [2024-10-04 08:26:41.257120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.706 [2024-10-04 
08:26:41.257175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:03005959 cdw11:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.706 [2024-10-04 08:26:41.257192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.706 [2024-10-04 08:26:41.257243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:cacacac4 cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.706 [2024-10-04 08:26:41.257257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.706 [2024-10-04 08:26:41.257306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.707 [2024-10-04 08:26:41.257319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:48.707 #40 NEW cov: 11832 ft: 14285 corp: 25/686b lim: 35 exec/s: 40 rss: 68Mb L: 32/33 MS: 1 CMP- DE: "\003\000\000\000"- 00:08:48.707 [2024-10-04 08:26:41.297035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:caca32ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.707 [2024-10-04 08:26:41.297060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.707 [2024-10-04 08:26:41.297114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:caca63ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.707 [2024-10-04 08:26:41.297128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.707 [2024-10-04 08:26:41.297182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.707 [2024-10-04 08:26:41.297200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.707 #41 NEW cov: 11832 ft: 14298 corp: 26/708b lim: 35 exec/s: 41 rss: 69Mb L: 22/33 MS: 1 InsertByte- 00:08:48.707 [2024-10-04 08:26:41.337346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:caca32ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.707 [2024-10-04 08:26:41.337371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.707 [2024-10-04 08:26:41.337427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.707 [2024-10-04 08:26:41.337444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.707 [2024-10-04 08:26:41.337496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:cacabb8a cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.707 [2024-10-04 08:26:41.337510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
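The MS: field on every NEW line names the chain of mutations libFuzzer applied to produce the input (ChangeBit, ChangeByte, CrossOver, CopyPart, and so on). CMP entries such as the DE: "\003\000\000\000" above are byte sequences the instrumentation observed as comparison operands and spliced back into the input; entries that keep earning coverage are replayed as PersAutoDict and are what the "Recommended dictionary" block at the end of each run reports. The runs in this log use only the stock mutators, but the same splice can be expressed through libFuzzer's custom-mutator hook; a toy sketch under that assumption:

/*
 * Toy dictionary-splice mutator using libFuzzer's real hook signature.
 * Illustrative only: llvm_nvme_fuzz does not install a custom mutator,
 * and libFuzzer's built-in CMP/PersAutoDict logic is more involved.
 */
#include <stddef.h>
#include <stdint.h>
#include <stdlib.h>
#include <string.h>

size_t LLVMFuzzerCustomMutator(uint8_t *data, size_t size,
                               size_t max_size, unsigned int seed)
{
    static const uint8_t dict_entry[4] = { 0x03, 0x00, 0x00, 0x00 };
    (void)max_size;                     /* we never grow the input */
    if (size < sizeof(dict_entry))
        return size;                    /* too short to splice into */
    srand(seed);
    size_t off = (size_t)rand() % (size - sizeof(dict_entry) + 1);
    memcpy(data + off, dict_entry, sizeof(dict_entry));  /* overwrite in place */
    return size;
}

The only contract is returning the new input length, never above max_size; libFuzzer then feeds the mutated buffer to the next TestOneInput call.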
00:08:48.707 [2024-10-04 08:26:41.337564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.707 [2024-10-04 08:26:41.337577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:48.707 #42 NEW cov: 11832 ft: 14309 corp: 27/742b lim: 35 exec/s: 42 rss: 69Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:08:48.707 [2024-10-04 08:26:41.377481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ca2432ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.707 [2024-10-04 08:26:41.377506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.707 [2024-10-04 08:26:41.377559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cacacaca cdw11:c4ca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.707 [2024-10-04 08:26:41.377573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.707 [2024-10-04 08:26:41.377626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:cac8caca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.707 [2024-10-04 08:26:41.377640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.707 [2024-10-04 08:26:41.377689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:cacaeaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.707 [2024-10-04 08:26:41.377702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:48.966 #43 NEW cov: 11832 ft: 14315 corp: 28/770b lim: 35 exec/s: 43 rss: 69Mb L: 28/34 MS: 1 ChangeBit- 00:08:48.966 [2024-10-04 08:26:41.417395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:caca32ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.966 [2024-10-04 08:26:41.417420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.966 [2024-10-04 08:26:41.417476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cacaca8a cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.966 [2024-10-04 08:26:41.417491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.966 [2024-10-04 08:26:41.417543] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.966 [2024-10-04 08:26:41.417557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.966 #44 NEW cov: 11832 ft: 14338 corp: 29/797b lim: 35 exec/s: 44 rss: 69Mb L: 27/34 MS: 1 ShuffleBytes- 00:08:48.966 [2024-10-04 08:26:41.457532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:caca32ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.966 [2024-10-04 08:26:41.457558] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.966 [2024-10-04 08:26:41.457612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.966 [2024-10-04 08:26:41.457629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.966 [2024-10-04 08:26:41.457683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:cacaca41 cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.966 [2024-10-04 08:26:41.457697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.966 #45 NEW cov: 11832 ft: 14431 corp: 30/820b lim: 35 exec/s: 45 rss: 69Mb L: 23/34 MS: 1 InsertByte- 00:08:48.967 [2024-10-04 08:26:41.497825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:caca32ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.967 [2024-10-04 08:26:41.497851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.967 [2024-10-04 08:26:41.497904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.967 [2024-10-04 08:26:41.497918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.967 [2024-10-04 08:26:41.497971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.967 [2024-10-04 08:26:41.497984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.967 [2024-10-04 08:26:41.498035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:e8e8e8e8 cdw11:e8e80003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.967 [2024-10-04 08:26:41.498047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:48.967 #46 NEW cov: 11832 ft: 14437 corp: 31/853b lim: 35 exec/s: 46 rss: 69Mb L: 33/34 MS: 1 InsertRepeatedBytes- 00:08:48.967 [2024-10-04 08:26:41.537929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:caca32ca cdw11:24ca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.967 [2024-10-04 08:26:41.537954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.967 [2024-10-04 08:26:41.538007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0c005959 cdw11:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.967 [2024-10-04 08:26:41.538021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.967 [2024-10-04 08:26:41.538072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:5959caca cdw11:0c000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.967 [2024-10-04 
08:26:41.538085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.967 [2024-10-04 08:26:41.538138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:caca00ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.967 [2024-10-04 08:26:41.538151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:48.967 #47 NEW cov: 11832 ft: 14446 corp: 32/885b lim: 35 exec/s: 47 rss: 69Mb L: 32/34 MS: 1 CopyPart- 00:08:48.967 [2024-10-04 08:26:41.577843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:caca32ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.967 [2024-10-04 08:26:41.577867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.967 [2024-10-04 08:26:41.577919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.967 [2024-10-04 08:26:41.577937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.967 [2024-10-04 08:26:41.577990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.967 [2024-10-04 08:26:41.578003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.967 #48 NEW cov: 11832 ft: 14458 corp: 33/909b lim: 35 exec/s: 48 rss: 69Mb L: 24/34 MS: 1 EraseBytes- 00:08:48.967 [2024-10-04 08:26:41.618124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ca2432ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.967 [2024-10-04 08:26:41.618149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.967 [2024-10-04 08:26:41.618207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cacacaca cdw11:c4ca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.967 [2024-10-04 08:26:41.618221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.967 [2024-10-04 08:26:41.618275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:cac8caca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.967 [2024-10-04 08:26:41.618289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.967 [2024-10-04 08:26:41.618339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:cacaeac8 cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.967 [2024-10-04 08:26:41.618352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:48.967 #49 NEW cov: 11832 ft: 14472 corp: 34/937b lim: 35 exec/s: 49 rss: 69Mb L: 28/34 MS: 1 ChangeBinInt- 00:08:49.227 [2024-10-04 08:26:41.657952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) 
qid:0 cid:4 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.227 [2024-10-04 08:26:41.657977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:49.227 [2024-10-04 08:26:41.658030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.227 [2024-10-04 08:26:41.658043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:49.227 #50 NEW cov: 11832 ft: 14485 corp: 35/951b lim: 35 exec/s: 50 rss: 69Mb L: 14/34 MS: 1 EraseBytes- 00:08:49.227 [2024-10-04 08:26:41.698052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:caca32ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.227 [2024-10-04 08:26:41.698077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:49.227 [2024-10-04 08:26:41.698131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cacaca8a cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.227 [2024-10-04 08:26:41.698144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:49.227 #51 NEW cov: 11832 ft: 14498 corp: 36/970b lim: 35 exec/s: 51 rss: 69Mb L: 19/34 MS: 1 EraseBytes- 00:08:49.227 [2024-10-04 08:26:41.738162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:caca32ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.227 [2024-10-04 08:26:41.738193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:49.227 [2024-10-04 08:26:41.738267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cacaca8a cdw11:32ca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.227 [2024-10-04 08:26:41.738285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:49.227 #52 NEW cov: 11832 ft: 14560 corp: 37/985b lim: 35 exec/s: 52 rss: 69Mb L: 15/34 MS: 1 CrossOver- 00:08:49.227 [2024-10-04 08:26:41.778402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:caca32ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.227 [2024-10-04 08:26:41.778427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:49.227 [2024-10-04 08:26:41.778480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cac4ca4a cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.227 [2024-10-04 08:26:41.778494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:49.227 [2024-10-04 08:26:41.778548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.227 [2024-10-04 08:26:41.778561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
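The status line fields repeated below (for example the "#53 NEW cov: 11832 ft: 14574 corp: 38/1012b lim: 35 exec/s: 53 rss: 69Mb L: 27/34" line just after this note) decode roughly as: cov is the number of coverage points hit so far, ft the number of distinct coverage features, corp the corpus entry count and its total size in bytes, lim the current cap on input length, exec/s the execution rate, and L this input's length over the largest in the corpus. New coverage is detected by scanning the inline 8-bit counters announced in each run's INFO banner after every input; a rough model of that bookkeeping, not libFuzzer's actual implementation:

/*
 * Rough model of coverage bookkeeping over inline 8-bit counters.
 * The counter total matches the "344599 inline 8-bit counters" banner;
 * everything else here is a simplification for illustration.
 */
#include <stddef.h>
#include <stdint.h>
#include <string.h>

#define N_COUNTERS 344599

static uint8_t counters[N_COUNTERS];  /* bumped by instrumented edges */
static uint8_t seen[N_COUNTERS];      /* union over all inputs so far */

/* Conceptually run after each TestOneInput: a counter firing for the
 * first time means new coverage, so the input is kept and a "NEW" line
 * is printed. Real libFuzzer also buckets counter values into features,
 * which is why ft grows faster than cov in the lines above. */
static int found_new_coverage(void)
{
    int is_new = 0;
    for (size_t i = 0; i < N_COUNTERS; i++) {
        if (counters[i] != 0 && seen[i] == 0) {
            seen[i] = 1;
            is_new = 1;
        }
    }
    memset(counters, 0, sizeof(counters));  /* reset before the next input */
    return is_new;
}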
00:08:49.227 #53 NEW cov: 11832 ft: 14574 corp: 38/1012b lim: 35 exec/s: 53 rss: 69Mb L: 27/34 MS: 1 ChangeBit- 00:08:49.227 [2024-10-04 08:26:41.818424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:caca32ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.227 [2024-10-04 08:26:41.818449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:49.227 [2024-10-04 08:26:41.818502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cacaca8a cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.227 [2024-10-04 08:26:41.818514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:49.227 [2024-10-04 08:26:41.818567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.227 [2024-10-04 08:26:41.818581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:49.227 #54 NEW cov: 11832 ft: 14583 corp: 39/1039b lim: 35 exec/s: 54 rss: 69Mb L: 27/34 MS: 1 ShuffleBytes- 00:08:49.227 [2024-10-04 08:26:41.858655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:caca32ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.227 [2024-10-04 08:26:41.858681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:49.227 [2024-10-04 08:26:41.858736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:caca63ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.227 [2024-10-04 08:26:41.858750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:49.227 [2024-10-04 08:26:41.858804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:35353635 cdw11:35350000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.227 [2024-10-04 08:26:41.858817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:49.227 #55 NEW cov: 11832 ft: 14589 corp: 40/1061b lim: 35 exec/s: 55 rss: 69Mb L: 22/34 MS: 1 ChangeBinInt- 00:08:49.227 [2024-10-04 08:26:41.898951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:15f532ca cdw11:91660002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.227 [2024-10-04 08:26:41.898977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:49.227 [2024-10-04 08:26:41.899034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00cae068 cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.227 [2024-10-04 08:26:41.899048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:49.227 [2024-10-04 08:26:41.899100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.227 [2024-10-04 08:26:41.899113] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:49.227 [2024-10-04 08:26:41.899163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.227 [2024-10-04 08:26:41.899176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:49.487 #56 NEW cov: 11832 ft: 14635 corp: 41/1091b lim: 35 exec/s: 56 rss: 69Mb L: 30/34 MS: 1 CMP- DE: "\025\365\221fs\340h\000"- 00:08:49.487 [2024-10-04 08:26:41.939064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:caca32ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.487 [2024-10-04 08:26:41.939089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:49.487 [2024-10-04 08:26:41.939142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cacacaca cdw11:ca590000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.487 [2024-10-04 08:26:41.939155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:49.487 [2024-10-04 08:26:41.939216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00ca0000 cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.487 [2024-10-04 08:26:41.939237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:49.487 [2024-10-04 08:26:41.939292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:caca41ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.487 [2024-10-04 08:26:41.939306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:49.487 #57 NEW cov: 11832 ft: 14677 corp: 42/1120b lim: 35 exec/s: 57 rss: 69Mb L: 29/34 MS: 1 CrossOver- 00:08:49.487 [2024-10-04 08:26:41.979175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:caca32ca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.487 [2024-10-04 08:26:41.979207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:49.487 [2024-10-04 08:26:41.979262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.487 [2024-10-04 08:26:41.979275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:49.487 [2024-10-04 08:26:41.979328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.487 [2024-10-04 08:26:41.979341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:49.487 [2024-10-04 08:26:41.979394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:cacacaca cdw11:caca0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.487 
[2024-10-04 08:26:41.979407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:49.487 #58 NEW cov: 11832 ft: 14707 corp: 43/1154b lim: 35 exec/s: 29 rss: 69Mb L: 34/34 MS: 1 CopyPart- 00:08:49.487 #58 DONE cov: 11832 ft: 14707 corp: 43/1154b lim: 35 exec/s: 29 rss: 69Mb 00:08:49.487 ###### Recommended dictionary. ###### 00:08:49.487 "\376\377\377\365" # Uses: 0 00:08:49.487 "\014\000\000\000" # Uses: 0 00:08:49.487 "\003\000\000\000" # Uses: 0 00:08:49.487 "\025\365\221fs\340h\000" # Uses: 0 00:08:49.487 ###### End of recommended dictionary. ###### 00:08:49.487 Done 58 runs in 2 second(s) 00:08:49.487 08:26:42 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_4.conf 00:08:49.487 08:26:42 -- ../common.sh@72 -- # (( i++ )) 00:08:49.487 08:26:42 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:49.487 08:26:42 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:08:49.487 08:26:42 -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:08:49.487 08:26:42 -- nvmf/run.sh@24 -- # local timen=1 00:08:49.487 08:26:42 -- nvmf/run.sh@25 -- # local core=0x1 00:08:49.487 08:26:42 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:08:49.487 08:26:42 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:08:49.487 08:26:42 -- nvmf/run.sh@29 -- # printf %02d 5 00:08:49.487 08:26:42 -- nvmf/run.sh@29 -- # port=4405 00:08:49.487 08:26:42 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:08:49.487 08:26:42 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:08:49.487 08:26:42 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:49.487 08:26:42 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 -r /var/tmp/spdk5.sock 00:08:49.487 [2024-10-04 08:26:42.155888] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:08:49.488 [2024-10-04 08:26:42.155985] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1014272 ] 00:08:49.747 EAL: No free 2048 kB hugepages reported on node 1 00:08:49.747 [2024-10-04 08:26:42.335038] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:49.747 [2024-10-04 08:26:42.354806] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:49.747 [2024-10-04 08:26:42.354922] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:49.747 [2024-10-04 08:26:42.406269] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:49.747 [2024-10-04 08:26:42.422617] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:08:50.007 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:50.007 INFO: Seed: 2028643568 00:08:50.007 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:50.007 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:50.007 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:08:50.007 INFO: A corpus is not provided, starting from an empty corpus 00:08:50.007 #2 INITED exec/s: 0 rss: 59Mb 00:08:50.007 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:50.007 This may also happen if the target rejected all inputs we tried so far 00:08:50.007 [2024-10-04 08:26:42.471884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.007 [2024-10-04 08:26:42.471918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:50.266 NEW_FUNC[1/670]: 0x4596b8 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:08:50.266 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:50.266 #38 NEW cov: 11596 ft: 11615 corp: 2/10b lim: 45 exec/s: 0 rss: 67Mb L: 9/9 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:08:50.266 [2024-10-04 08:26:42.783126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff4aff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.266 [2024-10-04 08:26:42.783160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:50.266 NEW_FUNC[1/1]: 0x19613b8 in event_queue_run_batch /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:528 00:08:50.266 #40 NEW cov: 11729 ft: 12132 corp: 3/20b lim: 45 exec/s: 0 rss: 67Mb L: 10/10 MS: 2 ChangeBit-CrossOver- 00:08:50.266 [2024-10-04 08:26:42.833211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ff35ffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.266 [2024-10-04 08:26:42.833238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:50.266 #43 NEW cov: 11735 ft: 12274 corp: 4/29b lim: 45 exec/s: 0 rss: 67Mb L: 9/10 MS: 3 CrossOver-CopyPart-InsertByte- 00:08:50.266 [2024-10-04 08:26:42.873366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.266 [2024-10-04 08:26:42.873391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:50.266 #47 NEW cov: 11820 ft: 12650 corp: 5/39b lim: 45 exec/s: 0 rss: 67Mb L: 10/10 MS: 4 ShuffleBytes-InsertByte-ChangeBinInt-PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:08:50.266 [2024-10-04 08:26:42.913068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ff5dffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.266 [2024-10-04 08:26:42.913096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:50.266 
#56 NEW cov: 11820 ft: 12709 corp: 6/53b lim: 45 exec/s: 0 rss: 67Mb L: 14/14 MS: 4 EraseBytes-ChangeByte-CopyPart-CrossOver- 00:08:50.525 [2024-10-04 08:26:42.953303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:30ffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.525 [2024-10-04 08:26:42.953331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:50.525 #57 NEW cov: 11820 ft: 12869 corp: 7/62b lim: 45 exec/s: 0 rss: 67Mb L: 9/14 MS: 1 ChangeByte- 00:08:50.525 [2024-10-04 08:26:43.003789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.525 [2024-10-04 08:26:43.003815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:50.526 #58 NEW cov: 11820 ft: 13003 corp: 8/76b lim: 45 exec/s: 0 rss: 67Mb L: 14/14 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:08:50.526 [2024-10-04 08:26:43.053880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff4aff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.526 [2024-10-04 08:26:43.053908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:50.526 #59 NEW cov: 11820 ft: 13022 corp: 9/86b lim: 45 exec/s: 0 rss: 67Mb L: 10/14 MS: 1 ShuffleBytes- 00:08:50.526 [2024-10-04 08:26:43.114466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.526 [2024-10-04 08:26:43.114493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:50.526 [2024-10-04 08:26:43.114639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.526 [2024-10-04 08:26:43.114655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:50.526 #60 NEW cov: 11820 ft: 13819 corp: 10/104b lim: 45 exec/s: 0 rss: 67Mb L: 18/18 MS: 1 CopyPart- 00:08:50.526 [2024-10-04 08:26:43.164253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ff5df1ff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.526 [2024-10-04 08:26:43.164279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:50.526 #61 NEW cov: 11820 ft: 13846 corp: 11/118b lim: 45 exec/s: 0 rss: 67Mb L: 14/18 MS: 1 ChangeByte- 00:08:50.785 [2024-10-04 08:26:43.214773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:30ffffff cdw11:ffff0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.785 [2024-10-04 08:26:43.214800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:50.785 [2024-10-04 08:26:43.214934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ff0affff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.785 [2024-10-04 08:26:43.214952] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:50.785 #62 NEW cov: 11820 ft: 13938 corp: 12/136b lim: 45 exec/s: 0 rss: 67Mb L: 18/18 MS: 1 CopyPart- 00:08:50.785 [2024-10-04 08:26:43.275271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:30ffffff cdw11:ffff0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.785 [2024-10-04 08:26:43.275299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:50.785 [2024-10-04 08:26:43.275414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ff0affff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.785 [2024-10-04 08:26:43.275429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:50.785 [2024-10-04 08:26:43.275555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.785 [2024-10-04 08:26:43.275572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:50.785 #63 NEW cov: 11820 ft: 14269 corp: 13/168b lim: 45 exec/s: 0 rss: 67Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:08:50.785 [2024-10-04 08:26:43.334781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ff5df1ff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.785 [2024-10-04 08:26:43.334808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:50.785 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:50.785 #64 NEW cov: 11843 ft: 14302 corp: 14/182b lim: 45 exec/s: 0 rss: 67Mb L: 14/32 MS: 1 ChangeASCIIInt- 00:08:50.785 [2024-10-04 08:26:43.385004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff950001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.785 [2024-10-04 08:26:43.385032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:50.785 #65 NEW cov: 11843 ft: 14339 corp: 15/195b lim: 45 exec/s: 0 rss: 68Mb L: 13/32 MS: 1 CMP- DE: "\377\377\377\225"- 00:08:50.785 [2024-10-04 08:26:43.435099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.785 [2024-10-04 08:26:43.435127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:50.785 #66 NEW cov: 11843 ft: 14461 corp: 16/205b lim: 45 exec/s: 66 rss: 68Mb L: 10/32 MS: 1 ChangeBinInt- 00:08:51.045 [2024-10-04 08:26:43.495408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff4aff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.045 [2024-10-04 08:26:43.495439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.045 #67 NEW cov: 11843 ft: 14486 corp: 17/220b lim: 45 exec/s: 67 rss: 68Mb L: 15/32 MS: 1 InsertRepeatedBytes- 
00:08:51.045 [2024-10-04 08:26:43.545518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff4aff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.045 [2024-10-04 08:26:43.545545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.045 #68 NEW cov: 11843 ft: 14515 corp: 18/235b lim: 45 exec/s: 68 rss: 68Mb L: 15/32 MS: 1 ChangeBinInt- 00:08:51.045 [2024-10-04 08:26:43.595696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff9cff cdw11:ff950001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.045 [2024-10-04 08:26:43.595723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.045 #69 NEW cov: 11843 ft: 14527 corp: 19/248b lim: 45 exec/s: 69 rss: 68Mb L: 13/32 MS: 1 ChangeByte- 00:08:51.045 [2024-10-04 08:26:43.645873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.045 [2024-10-04 08:26:43.645903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.045 #70 NEW cov: 11843 ft: 14654 corp: 20/258b lim: 45 exec/s: 70 rss: 68Mb L: 10/32 MS: 1 ChangeBinInt- 00:08:51.045 [2024-10-04 08:26:43.695943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ff484aff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.045 [2024-10-04 08:26:43.695971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.045 #71 NEW cov: 11843 ft: 14671 corp: 21/273b lim: 45 exec/s: 71 rss: 68Mb L: 15/32 MS: 1 ChangeByte- 00:08:51.305 [2024-10-04 08:26:43.756232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ff860007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.305 [2024-10-04 08:26:43.756259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.305 #72 NEW cov: 11843 ft: 14694 corp: 22/283b lim: 45 exec/s: 72 rss: 68Mb L: 10/32 MS: 1 ChangeByte- 00:08:51.305 [2024-10-04 08:26:43.816401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0000b600 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.305 [2024-10-04 08:26:43.816427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.305 #73 NEW cov: 11843 ft: 14774 corp: 23/293b lim: 45 exec/s: 73 rss: 68Mb L: 10/32 MS: 1 ChangeBinInt- 00:08:51.305 [2024-10-04 08:26:43.876560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff4aff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.305 [2024-10-04 08:26:43.876586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.305 #74 NEW cov: 11843 ft: 14842 corp: 24/303b lim: 45 exec/s: 74 rss: 68Mb L: 10/32 MS: 1 ShuffleBytes- 00:08:51.305 [2024-10-04 08:26:43.926662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff4aff cdw11:ffff0007 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.305 [2024-10-04 08:26:43.926689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.305 #75 NEW cov: 11843 ft: 14906 corp: 25/313b lim: 45 exec/s: 75 rss: 68Mb L: 10/32 MS: 1 ChangeByte- 00:08:51.305 [2024-10-04 08:26:43.976794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ff494aff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.305 [2024-10-04 08:26:43.976825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.564 #76 NEW cov: 11843 ft: 14912 corp: 26/328b lim: 45 exec/s: 76 rss: 68Mb L: 15/32 MS: 1 ChangeBit- 00:08:51.564 [2024-10-04 08:26:44.027067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ff5df1ff cdw11:ff340007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.564 [2024-10-04 08:26:44.027093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.564 #77 NEW cov: 11843 ft: 14936 corp: 27/342b lim: 45 exec/s: 77 rss: 69Mb L: 14/32 MS: 1 ShuffleBytes- 00:08:51.564 [2024-10-04 08:26:44.077150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ff5dffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.564 [2024-10-04 08:26:44.077178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.564 #78 NEW cov: 11843 ft: 14941 corp: 28/356b lim: 45 exec/s: 78 rss: 69Mb L: 14/32 MS: 1 ShuffleBytes- 00:08:51.564 [2024-10-04 08:26:44.117144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ff5df1ff cdw11:fdff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.564 [2024-10-04 08:26:44.117171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.564 #79 NEW cov: 11843 ft: 14991 corp: 29/370b lim: 45 exec/s: 79 rss: 69Mb L: 14/32 MS: 1 ChangeBit- 00:08:51.564 [2024-10-04 08:26:44.167382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff4aff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.564 [2024-10-04 08:26:44.167407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.564 #80 NEW cov: 11843 ft: 15008 corp: 30/381b lim: 45 exec/s: 80 rss: 69Mb L: 11/32 MS: 1 InsertByte- 00:08:51.564 [2024-10-04 08:26:44.218144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:30ffffff cdw11:ffff0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.564 [2024-10-04 08:26:44.218172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.564 [2024-10-04 08:26:44.218291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ff0a0006 cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.564 [2024-10-04 08:26:44.218309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:51.564 [2024-10-04 08:26:44.218423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.564 [2024-10-04 08:26:44.218438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:51.823 #81 NEW cov: 11843 ft: 15015 corp: 31/413b lim: 45 exec/s: 81 rss: 69Mb L: 32/32 MS: 1 ChangeBinInt- 00:08:51.823 [2024-10-04 08:26:44.278046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0000b600 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.823 [2024-10-04 08:26:44.278072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.823 [2024-10-04 08:26:44.278195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:acac0aac cdw11:acac0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.823 [2024-10-04 08:26:44.278213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:51.823 #82 NEW cov: 11843 ft: 15026 corp: 32/439b lim: 45 exec/s: 82 rss: 69Mb L: 26/32 MS: 1 InsertRepeatedBytes- 00:08:51.823 [2024-10-04 08:26:44.338290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.823 [2024-10-04 08:26:44.338321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.823 [2024-10-04 08:26:44.338442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.823 [2024-10-04 08:26:44.338459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:51.823 #83 NEW cov: 11843 ft: 15033 corp: 33/457b lim: 45 exec/s: 83 rss: 69Mb L: 18/32 MS: 1 CopyPart- 00:08:51.823 [2024-10-04 08:26:44.398211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff4aff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.823 [2024-10-04 08:26:44.398239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.823 #84 NEW cov: 11843 ft: 15053 corp: 34/467b lim: 45 exec/s: 84 rss: 69Mb L: 10/32 MS: 1 ShuffleBytes- 00:08:51.823 [2024-10-04 08:26:44.448367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:fe5dffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.823 [2024-10-04 08:26:44.448393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.823 #90 NEW cov: 11843 ft: 15057 corp: 35/481b lim: 45 exec/s: 45 rss: 69Mb L: 14/32 MS: 1 ChangeBit- 00:08:51.823 #90 DONE cov: 11843 ft: 15057 corp: 35/481b lim: 45 exec/s: 45 rss: 69Mb 00:08:51.823 ###### Recommended dictionary. ###### 00:08:51.823 "\377\377\377\377\377\377\377\377" # Uses: 2 00:08:51.823 "\377\377\377\225" # Uses: 0 00:08:51.823 ###### End of recommended dictionary. 
###### 00:08:51.823 Done 90 runs in 2 second(s) 00:08:52.082 08:26:44 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_5.conf 00:08:52.082 08:26:44 -- ../common.sh@72 -- # (( i++ )) 00:08:52.082 08:26:44 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:52.082 08:26:44 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:08:52.082 08:26:44 -- nvmf/run.sh@23 -- # local fuzzer_type=6 00:08:52.082 08:26:44 -- nvmf/run.sh@24 -- # local timen=1 00:08:52.082 08:26:44 -- nvmf/run.sh@25 -- # local core=0x1 00:08:52.082 08:26:44 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:08:52.082 08:26:44 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf 00:08:52.082 08:26:44 -- nvmf/run.sh@29 -- # printf %02d 6 00:08:52.082 08:26:44 -- nvmf/run.sh@29 -- # port=4406 00:08:52.082 08:26:44 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:08:52.082 08:26:44 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' 00:08:52.082 08:26:44 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:52.082 08:26:44 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 -r /var/tmp/spdk6.sock 00:08:52.082 [2024-10-04 08:26:44.624119] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:08:52.082 [2024-10-04 08:26:44.624218] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1014674 ] 00:08:52.082 EAL: No free 2048 kB hugepages reported on node 1 00:08:52.342 [2024-10-04 08:26:44.802880] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:52.342 [2024-10-04 08:26:44.822072] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:52.342 [2024-10-04 08:26:44.822193] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:52.342 [2024-10-04 08:26:44.873488] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:52.342 [2024-10-04 08:26:44.889804] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:08:52.342 INFO: Running with entropic power schedule (0xFF, 100). 00:08:52.342 INFO: Seed: 202682861 00:08:52.342 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:52.342 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:52.342 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:08:52.342 INFO: A corpus is not provided, starting from an empty corpus 00:08:52.342 #2 INITED exec/s: 0 rss: 59Mb 00:08:52.342 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:52.342 This may also happen if the target rejected all inputs we tried so far 00:08:52.342 [2024-10-04 08:26:44.955735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:52.342 [2024-10-04 08:26:44.955770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:52.602 NEW_FUNC[1/669]: 0x45bec8 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:08:52.602 NEW_FUNC[2/669]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:52.602 #3 NEW cov: 11526 ft: 11527 corp: 2/3b lim: 10 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 CopyPart- 00:08:52.860 [2024-10-04 08:26:45.286557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:52.860 [2024-10-04 08:26:45.286603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:52.860 #4 NEW cov: 11646 ft: 12195 corp: 3/5b lim: 10 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 CrossOver- 00:08:52.860 [2024-10-04 08:26:45.336620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:52.860 [2024-10-04 08:26:45.336649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:52.860 #5 NEW cov: 11652 ft: 12511 corp: 4/7b lim: 10 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 ShuffleBytes- 00:08:52.860 [2024-10-04 08:26:45.376748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:52.861 [2024-10-04 08:26:45.376775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:52.861 #6 NEW cov: 11737 ft: 12739 corp: 5/9b lim: 10 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 CopyPart- 00:08:52.861 [2024-10-04 08:26:45.426822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ecf5 cdw11:00000000 00:08:52.861 [2024-10-04 08:26:45.426848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:52.861 #7 NEW cov: 11737 ft: 12878 corp: 6/11b lim: 10 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 ChangeBinInt- 00:08:52.861 [2024-10-04 08:26:45.467580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000ad9 cdw11:00000000 00:08:52.861 [2024-10-04 08:26:45.467607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:52.861 [2024-10-04 08:26:45.467717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:52.861 [2024-10-04 08:26:45.467734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:52.861 [2024-10-04 08:26:45.467836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:52.861 [2024-10-04 08:26:45.467852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:52.861 [2024-10-04 08:26:45.467968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:52.861 [2024-10-04 08:26:45.467984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:52.861 #8 NEW cov: 11737 ft: 13217 corp: 7/20b lim: 10 exec/s: 0 rss: 67Mb L: 9/9 MS: 1 InsertRepeatedBytes- 00:08:52.861 [2024-10-04 08:26:45.507111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a72 cdw11:00000000 00:08:52.861 [2024-10-04 08:26:45.507138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:52.861 #9 NEW cov: 11737 ft: 13280 corp: 8/23b lim: 10 exec/s: 0 rss: 67Mb L: 3/9 MS: 1 InsertByte- 00:08:53.119 [2024-10-04 08:26:45.547377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:53.119 [2024-10-04 08:26:45.547404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.119 [2024-10-04 08:26:45.547513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:53.119 [2024-10-04 08:26:45.547529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:53.119 #10 NEW cov: 11737 ft: 13488 corp: 9/27b lim: 10 exec/s: 0 rss: 67Mb L: 4/9 MS: 1 CopyPart- 00:08:53.119 [2024-10-04 08:26:45.587847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000ad9 cdw11:00000000 00:08:53.120 [2024-10-04 08:26:45.587874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.120 [2024-10-04 08:26:45.587992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:53.120 [2024-10-04 08:26:45.588009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:53.120 [2024-10-04 08:26:45.588121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:53.120 [2024-10-04 08:26:45.588136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:53.120 [2024-10-04 08:26:45.588248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:53.120 [2024-10-04 08:26:45.588263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:53.120 #11 NEW cov: 11737 ft: 13529 corp: 10/36b lim: 10 exec/s: 0 rss: 67Mb L: 9/9 MS: 1 ShuffleBytes- 00:08:53.120 [2024-10-04 08:26:45.627622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:53.120 [2024-10-04 08:26:45.627648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.120 [2024-10-04 08:26:45.627759] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000040a cdw11:00000000 00:08:53.120 [2024-10-04 08:26:45.627776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:53.120 #12 NEW cov: 11737 ft: 13604 corp: 11/40b lim: 10 exec/s: 0 rss: 67Mb L: 4/9 MS: 1 ChangeBinInt- 00:08:53.120 [2024-10-04 08:26:45.668380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000ad9 cdw11:00000000 00:08:53.120 [2024-10-04 08:26:45.668405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.120 [2024-10-04 08:26:45.668509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:53.120 [2024-10-04 08:26:45.668529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:53.120 [2024-10-04 08:26:45.668642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:53.120 [2024-10-04 08:26:45.668658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:53.120 [2024-10-04 08:26:45.668773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:53.120 [2024-10-04 08:26:45.668787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:53.120 [2024-10-04 08:26:45.668891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000d972 cdw11:00000000 00:08:53.120 [2024-10-04 08:26:45.668905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:53.120 #13 NEW cov: 11737 ft: 13659 corp: 12/50b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 CrossOver- 00:08:53.120 [2024-10-04 08:26:45.708226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000ad9 cdw11:00000000 00:08:53.120 [2024-10-04 08:26:45.708252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.120 [2024-10-04 08:26:45.708366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:53.120 [2024-10-04 08:26:45.708382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:53.120 [2024-10-04 08:26:45.708498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:53.120 [2024-10-04 08:26:45.708513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:53.120 [2024-10-04 08:26:45.708627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:53.120 [2024-10-04 08:26:45.708642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:53.120 #14 NEW cov: 11737 ft: 13771 corp: 13/59b lim: 
10 exec/s: 0 rss: 68Mb L: 9/10 MS: 1 ShuffleBytes- 00:08:53.120 [2024-10-04 08:26:45.748531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004ad9 cdw11:00000000 00:08:53.120 [2024-10-04 08:26:45.748557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.120 [2024-10-04 08:26:45.748667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:53.120 [2024-10-04 08:26:45.748682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:53.120 [2024-10-04 08:26:45.748796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:53.120 [2024-10-04 08:26:45.748812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:53.120 [2024-10-04 08:26:45.748924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:53.120 [2024-10-04 08:26:45.748940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:53.120 [2024-10-04 08:26:45.749048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000d972 cdw11:00000000 00:08:53.120 [2024-10-04 08:26:45.749064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:53.120 #15 NEW cov: 11737 ft: 13786 corp: 14/69b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 ChangeBit- 00:08:53.120 [2024-10-04 08:26:45.787941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ecf5 cdw11:00000000 00:08:53.120 [2024-10-04 08:26:45.787967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.378 #16 NEW cov: 11737 ft: 13793 corp: 15/71b lim: 10 exec/s: 0 rss: 68Mb L: 2/10 MS: 1 ShuffleBytes- 00:08:53.378 [2024-10-04 08:26:45.828056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ece5 cdw11:00000000 00:08:53.378 [2024-10-04 08:26:45.828083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.378 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:53.378 #17 NEW cov: 11760 ft: 13847 corp: 16/73b lim: 10 exec/s: 0 rss: 68Mb L: 2/10 MS: 1 ChangeBit- 00:08:53.378 [2024-10-04 08:26:45.868552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000ad9 cdw11:00000000 00:08:53.378 [2024-10-04 08:26:45.868578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.378 [2024-10-04 08:26:45.868692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:53.378 [2024-10-04 08:26:45.868709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:53.378 [2024-10-04 08:26:45.868830] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:53.378 [2024-10-04 08:26:45.868847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:53.378 #18 NEW cov: 11760 ft: 13984 corp: 17/80b lim: 10 exec/s: 0 rss: 68Mb L: 7/10 MS: 1 EraseBytes- 00:08:53.378 [2024-10-04 08:26:45.908274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a72 cdw11:00000000 00:08:53.378 [2024-10-04 08:26:45.908301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.379 #20 NEW cov: 11760 ft: 14007 corp: 18/83b lim: 10 exec/s: 0 rss: 68Mb L: 3/10 MS: 2 ShuffleBytes-CrossOver- 00:08:53.379 [2024-10-04 08:26:45.948400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004be5 cdw11:00000000 00:08:53.379 [2024-10-04 08:26:45.948426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.379 #21 NEW cov: 11760 ft: 14038 corp: 19/85b lim: 10 exec/s: 21 rss: 68Mb L: 2/10 MS: 1 ChangeByte- 00:08:53.379 [2024-10-04 08:26:45.989113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000ad9 cdw11:00000000 00:08:53.379 [2024-10-04 08:26:45.989140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.379 [2024-10-04 08:26:45.989262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:53.379 [2024-10-04 08:26:45.989279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:53.379 [2024-10-04 08:26:45.989391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:53.379 [2024-10-04 08:26:45.989408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:53.379 [2024-10-04 08:26:45.989526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:000010d9 cdw11:00000000 00:08:53.379 [2024-10-04 08:26:45.989542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:53.379 #22 NEW cov: 11760 ft: 14052 corp: 20/94b lim: 10 exec/s: 22 rss: 68Mb L: 9/10 MS: 1 ChangeByte- 00:08:53.379 [2024-10-04 08:26:46.029466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000ad9 cdw11:00000000 00:08:53.379 [2024-10-04 08:26:46.029493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.379 [2024-10-04 08:26:46.029604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000d927 cdw11:00000000 00:08:53.379 [2024-10-04 08:26:46.029621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:53.379 [2024-10-04 08:26:46.029728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 
cdw10:0000d9d9 cdw11:00000000 00:08:53.379 [2024-10-04 08:26:46.029742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:53.379 [2024-10-04 08:26:46.029856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:53.379 [2024-10-04 08:26:46.029873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:53.379 [2024-10-04 08:26:46.029989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:53.379 [2024-10-04 08:26:46.030006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:53.379 #23 NEW cov: 11760 ft: 14060 corp: 21/104b lim: 10 exec/s: 23 rss: 68Mb L: 10/10 MS: 1 InsertByte- 00:08:53.637 [2024-10-04 08:26:46.068808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:53.637 [2024-10-04 08:26:46.068835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.637 #24 NEW cov: 11760 ft: 14093 corp: 22/107b lim: 10 exec/s: 24 rss: 68Mb L: 3/10 MS: 1 InsertByte- 00:08:53.637 [2024-10-04 08:26:46.108881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:53.637 [2024-10-04 08:26:46.108907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.637 #25 NEW cov: 11760 ft: 14107 corp: 23/110b lim: 10 exec/s: 25 rss: 68Mb L: 3/10 MS: 1 CopyPart- 00:08:53.637 [2024-10-04 08:26:46.149582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a72 cdw11:00000000 00:08:53.638 [2024-10-04 08:26:46.149610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.638 [2024-10-04 08:26:46.149726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a00 cdw11:00000000 00:08:53.638 [2024-10-04 08:26:46.149741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:53.638 [2024-10-04 08:26:46.149857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:53.638 [2024-10-04 08:26:46.149871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:53.638 [2024-10-04 08:26:46.149985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:53.638 [2024-10-04 08:26:46.150003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:53.638 #26 NEW cov: 11760 ft: 14119 corp: 24/119b lim: 10 exec/s: 26 rss: 68Mb L: 9/10 MS: 1 InsertRepeatedBytes- 00:08:53.638 [2024-10-04 08:26:46.199237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ecfd cdw11:00000000 00:08:53.638 [2024-10-04 08:26:46.199264] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.638 #27 NEW cov: 11760 ft: 14126 corp: 25/121b lim: 10 exec/s: 27 rss: 68Mb L: 2/10 MS: 1 ChangeBit- 00:08:53.638 [2024-10-04 08:26:46.239318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ebf5 cdw11:00000000 00:08:53.638 [2024-10-04 08:26:46.239344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.638 #28 NEW cov: 11760 ft: 14132 corp: 26/123b lim: 10 exec/s: 28 rss: 69Mb L: 2/10 MS: 1 ChangeBinInt- 00:08:53.638 [2024-10-04 08:26:46.280208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000ad9 cdw11:00000000 00:08:53.638 [2024-10-04 08:26:46.280251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.638 [2024-10-04 08:26:46.280384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:53.638 [2024-10-04 08:26:46.280400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:53.638 [2024-10-04 08:26:46.280511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000d9f9 cdw11:00000000 00:08:53.638 [2024-10-04 08:26:46.280526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:53.638 [2024-10-04 08:26:46.280637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:53.638 [2024-10-04 08:26:46.280654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:53.638 [2024-10-04 08:26:46.280762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000d972 cdw11:00000000 00:08:53.638 [2024-10-04 08:26:46.280778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:53.638 #29 NEW cov: 11760 ft: 14216 corp: 27/133b lim: 10 exec/s: 29 rss: 69Mb L: 10/10 MS: 1 ChangeBit- 00:08:53.896 [2024-10-04 08:26:46.319595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ebf5 cdw11:00000000 00:08:53.896 [2024-10-04 08:26:46.319622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.896 #30 NEW cov: 11760 ft: 14231 corp: 28/135b lim: 10 exec/s: 30 rss: 69Mb L: 2/10 MS: 1 ShuffleBytes- 00:08:53.896 [2024-10-04 08:26:46.359906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:53.896 [2024-10-04 08:26:46.359932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.896 [2024-10-04 08:26:46.360042] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:53.896 [2024-10-04 08:26:46.360058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:08:53.896 #31 NEW cov: 11760 ft: 14241 corp: 29/139b lim: 10 exec/s: 31 rss: 69Mb L: 4/10 MS: 1 CopyPart- 00:08:53.896 [2024-10-04 08:26:46.399765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ad0a cdw11:00000000 00:08:53.896 [2024-10-04 08:26:46.399792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.896 #33 NEW cov: 11760 ft: 14252 corp: 30/141b lim: 10 exec/s: 33 rss: 69Mb L: 2/10 MS: 2 EraseBytes-InsertByte- 00:08:53.896 [2024-10-04 08:26:46.440275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:53.896 [2024-10-04 08:26:46.440303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.896 [2024-10-04 08:26:46.440417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:53.896 [2024-10-04 08:26:46.440438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:53.896 [2024-10-04 08:26:46.440557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000ad9 cdw11:00000000 00:08:53.896 [2024-10-04 08:26:46.440573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:53.896 #34 NEW cov: 11760 ft: 14255 corp: 31/148b lim: 10 exec/s: 34 rss: 69Mb L: 7/10 MS: 1 ShuffleBytes- 00:08:53.896 [2024-10-04 08:26:46.490060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004be5 cdw11:00000000 00:08:53.896 [2024-10-04 08:26:46.490089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.896 #35 NEW cov: 11760 ft: 14265 corp: 32/150b lim: 10 exec/s: 35 rss: 69Mb L: 2/10 MS: 1 CopyPart- 00:08:53.896 [2024-10-04 08:26:46.530172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ac0a cdw11:00000000 00:08:53.896 [2024-10-04 08:26:46.530211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.896 #36 NEW cov: 11760 ft: 14280 corp: 33/152b lim: 10 exec/s: 36 rss: 69Mb L: 2/10 MS: 1 ChangeBit- 00:08:53.896 [2024-10-04 08:26:46.570313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00001d0a cdw11:00000000 00:08:53.896 [2024-10-04 08:26:46.570339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.222 #37 NEW cov: 11760 ft: 14289 corp: 34/154b lim: 10 exec/s: 37 rss: 69Mb L: 2/10 MS: 1 ChangeBinInt- 00:08:54.222 [2024-10-04 08:26:46.610969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000ad9 cdw11:00000000 00:08:54.222 [2024-10-04 08:26:46.610996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.222 [2024-10-04 08:26:46.611110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:54.222 [2024-10-04 08:26:46.611143] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.222 [2024-10-04 08:26:46.611253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00003bd9 cdw11:00000000 00:08:54.222 [2024-10-04 08:26:46.611272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:54.222 [2024-10-04 08:26:46.611387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:000010d9 cdw11:00000000 00:08:54.222 [2024-10-04 08:26:46.611403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:54.222 #38 NEW cov: 11760 ft: 14292 corp: 35/163b lim: 10 exec/s: 38 rss: 69Mb L: 9/10 MS: 1 ChangeByte- 00:08:54.222 [2024-10-04 08:26:46.660578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ece3 cdw11:00000000 00:08:54.222 [2024-10-04 08:26:46.660606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.222 #39 NEW cov: 11760 ft: 14302 corp: 36/165b lim: 10 exec/s: 39 rss: 69Mb L: 2/10 MS: 1 ChangeBinInt- 00:08:54.222 [2024-10-04 08:26:46.701320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:54.222 [2024-10-04 08:26:46.701346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.222 [2024-10-04 08:26:46.701462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:54.222 [2024-10-04 08:26:46.701480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.222 [2024-10-04 08:26:46.701607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000a5b cdw11:00000000 00:08:54.222 [2024-10-04 08:26:46.701622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:54.222 [2024-10-04 08:26:46.701741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:54.222 [2024-10-04 08:26:46.701759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:54.222 #40 NEW cov: 11760 ft: 14323 corp: 37/173b lim: 10 exec/s: 40 rss: 69Mb L: 8/10 MS: 1 CrossOver- 00:08:54.222 [2024-10-04 08:26:46.741098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:54.222 [2024-10-04 08:26:46.741126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.222 [2024-10-04 08:26:46.741241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a72 cdw11:00000000 00:08:54.222 [2024-10-04 08:26:46.741256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.222 #41 NEW cov: 11760 ft: 14331 corp: 38/178b lim: 10 exec/s: 41 rss: 69Mb L: 5/10 MS: 1 
CrossOver- 00:08:54.222 [2024-10-04 08:26:46.780907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00001ded cdw11:00000000 00:08:54.222 [2024-10-04 08:26:46.780933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.222 #42 NEW cov: 11760 ft: 14350 corp: 39/180b lim: 10 exec/s: 42 rss: 69Mb L: 2/10 MS: 1 ChangeBinInt- 00:08:54.222 [2024-10-04 08:26:46.821033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000037e5 cdw11:00000000 00:08:54.222 [2024-10-04 08:26:46.821058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.222 #45 NEW cov: 11760 ft: 14371 corp: 40/182b lim: 10 exec/s: 45 rss: 69Mb L: 2/10 MS: 3 EraseBytes-ShuffleBytes-InsertByte- 00:08:54.222 [2024-10-04 08:26:46.861951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:54.222 [2024-10-04 08:26:46.861977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.222 [2024-10-04 08:26:46.862088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:54.222 [2024-10-04 08:26:46.862103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.222 [2024-10-04 08:26:46.862232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000a5b cdw11:00000000 00:08:54.222 [2024-10-04 08:26:46.862247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:54.222 [2024-10-04 08:26:46.862375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000d972 cdw11:00000000 00:08:54.222 [2024-10-04 08:26:46.862391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:54.222 [2024-10-04 08:26:46.862505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00000ad9 cdw11:00000000 00:08:54.222 [2024-10-04 08:26:46.862519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:54.481 [2024-10-04 08:26:46.912059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:54.481 [2024-10-04 08:26:46.912089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.481 [2024-10-04 08:26:46.912202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:54.481 [2024-10-04 08:26:46.912218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.481 [2024-10-04 08:26:46.912331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000a5b cdw11:00000000 00:08:54.481 [2024-10-04 08:26:46.912350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:54.481 [2024-10-04 08:26:46.912466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000d972 cdw11:00000000 00:08:54.481 [2024-10-04 08:26:46.912483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:54.481 [2024-10-04 08:26:46.912593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:000072d9 cdw11:00000000 00:08:54.481 [2024-10-04 08:26:46.912609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:54.481 #47 NEW cov: 11760 ft: 14381 corp: 41/192b lim: 10 exec/s: 47 rss: 70Mb L: 10/10 MS: 2 CrossOver-CrossOver- 00:08:54.481 [2024-10-04 08:26:46.951548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000020a cdw11:00000000 00:08:54.481 [2024-10-04 08:26:46.951574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.481 [2024-10-04 08:26:46.951685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:54.481 [2024-10-04 08:26:46.951701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.481 #48 NEW cov: 11760 ft: 14392 corp: 42/196b lim: 10 exec/s: 24 rss: 70Mb L: 4/10 MS: 1 ChangeBit- 00:08:54.481 #48 DONE cov: 11760 ft: 14392 corp: 42/196b lim: 10 exec/s: 24 rss: 70Mb 00:08:54.481 Done 48 runs in 2 second(s) 00:08:54.481 08:26:47 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_6.conf 00:08:54.481 08:26:47 -- ../common.sh@72 -- # (( i++ )) 00:08:54.481 08:26:47 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:54.481 08:26:47 -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:08:54.481 08:26:47 -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:08:54.481 08:26:47 -- nvmf/run.sh@24 -- # local timen=1 00:08:54.481 08:26:47 -- nvmf/run.sh@25 -- # local core=0x1 00:08:54.481 08:26:47 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:08:54.481 08:26:47 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:08:54.481 08:26:47 -- nvmf/run.sh@29 -- # printf %02d 7 00:08:54.481 08:26:47 -- nvmf/run.sh@29 -- # port=4407 00:08:54.481 08:26:47 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:08:54.481 08:26:47 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:08:54.481 08:26:47 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:54.481 08:26:47 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 -r /var/tmp/spdk7.sock 00:08:54.481 [2024-10-04 08:26:47.133348] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
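A note for reading the output around this point: run.sh rewrites the fuzz JSON config from port 4420 to 4407 (the sed line above) and relaunches llvm_nvme_fuzz with -Z 7, which — per the NEW_FUNC line just below naming fuzz_admin_delete_io_submission_queue_command — exercises the DELETE IO SQ (00) admin opcode, where the previous instance fuzzed DELETE IO CQ (04). Each "#N NEW cov" line is libFuzzer reporting an input that reached new coverage, and each is bracketed by a pair of nvme_qpair notices: line 225 (nvme_admin_qpair_print_command) prints the fuzzed command, line 477 (spdk_nvme_print_completion) prints the controller's INVALID OPCODE rejection. The sketch below is a hedged illustration of how such a target can map raw fuzz bytes onto the cdw10 values seen in those prints; the toy_nvme_cmd struct and the byte-to-dword mapping are illustrative stand-ins, not SPDK's real code, which lives in test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c.

#include <stdint.h>
#include <stddef.h>
#include <string.h>

/* Illustrative stand-in for the NVMe submission-queue entry; the real
 * target builds SPDK's own command struct. */
struct toy_nvme_cmd {
	uint8_t  opc;    /* 0x00 = DELETE IO SQ, 0x04 = DELETE IO CQ */
	uint32_t cdw10;  /* for the delete-queue commands, bits 15:0 carry the queue ID */
	uint32_t cdw11;
};

/* Standard libFuzzer entry point (TestOneInput in the NEW_FUNC lines). */
int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
{
	struct toy_nvme_cmd cmd = { .opc = 0x00 };  /* DELETE IO SQ (00) */
	size_t n = size < sizeof(cmd.cdw10) ? size : sizeof(cmd.cdw10);

	/* Low-order fuzz bytes become CDW10: a two-byte input {0x0a, 0x85}
	 * prints as cdw10:0000850a in nvme_admin_qpair_print_command. */
	memcpy(&cmd.cdw10, data, n);

	/* A real target would now submit cmd on the admin qpair over the
	 * TCP transport (port 4407 here); the target controller rejects it,
	 * producing the INVALID OPCODE completions logged below. */
	return 0;
}

Since no live queue ever matches a fuzzed queue ID, every command is rejected, which is why the run accumulates coverage quickly but reports no crashes.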
00:08:54.481 [2024-10-04 08:26:47.133417] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1015211 ] 00:08:54.740 EAL: No free 2048 kB hugepages reported on node 1 00:08:54.740 [2024-10-04 08:26:47.311203] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:54.740 [2024-10-04 08:26:47.330115] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:54.740 [2024-10-04 08:26:47.330238] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:54.740 [2024-10-04 08:26:47.381720] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:54.740 [2024-10-04 08:26:47.398019] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:08:54.740 INFO: Running with entropic power schedule (0xFF, 100). 00:08:54.740 INFO: Seed: 2710691923 00:08:54.998 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:54.998 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:54.998 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:08:54.998 INFO: A corpus is not provided, starting from an empty corpus 00:08:54.998 #2 INITED exec/s: 0 rss: 59Mb 00:08:54.998 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:54.998 This may also happen if the target rejected all inputs we tried so far 00:08:54.998 [2024-10-04 08:26:47.463257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:54.998 [2024-10-04 08:26:47.463287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:55.256 NEW_FUNC[1/669]: 0x45c8c8 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:08:55.256 NEW_FUNC[2/669]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:55.256 #3 NEW cov: 11533 ft: 11534 corp: 2/3b lim: 10 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 CrossOver- 00:08:55.256 [2024-10-04 08:26:47.774137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000850a cdw11:00000000 00:08:55.256 [2024-10-04 08:26:47.774180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:55.256 #5 NEW cov: 11646 ft: 12106 corp: 3/5b lim: 10 exec/s: 0 rss: 67Mb L: 2/2 MS: 2 ShuffleBytes-InsertByte- 00:08:55.257 [2024-10-04 08:26:47.814093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008585 cdw11:00000000 00:08:55.257 [2024-10-04 08:26:47.814119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:55.257 #8 NEW cov: 11652 ft: 12350 corp: 4/7b lim: 10 exec/s: 0 rss: 67Mb L: 2/2 MS: 3 ShuffleBytes-CrossOver-CopyPart- 00:08:55.257 [2024-10-04 08:26:47.844177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000851a cdw11:00000000 00:08:55.257 [2024-10-04 08:26:47.844208] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:55.257 #9 NEW cov: 11737 ft: 12624 corp: 5/9b lim: 10 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 ChangeBit- 00:08:55.257 [2024-10-04 08:26:47.884284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000af8 cdw11:00000000 00:08:55.257 [2024-10-04 08:26:47.884312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:55.257 #10 NEW cov: 11737 ft: 12779 corp: 6/11b lim: 10 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 ChangeByte- 00:08:55.257 [2024-10-04 08:26:47.924419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000af8 cdw11:00000000 00:08:55.257 [2024-10-04 08:26:47.924445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:55.515 #11 NEW cov: 11737 ft: 12868 corp: 7/14b lim: 10 exec/s: 0 rss: 67Mb L: 3/3 MS: 1 CopyPart- 00:08:55.515 [2024-10-04 08:26:47.964531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:55.515 [2024-10-04 08:26:47.964558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:55.515 #12 NEW cov: 11737 ft: 12972 corp: 8/16b lim: 10 exec/s: 0 rss: 67Mb L: 2/3 MS: 1 CrossOver- 00:08:55.515 [2024-10-04 08:26:48.005094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008540 cdw11:00000000 00:08:55.515 [2024-10-04 08:26:48.005120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:55.515 [2024-10-04 08:26:48.005172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00004040 cdw11:00000000 00:08:55.515 [2024-10-04 08:26:48.005191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:55.515 [2024-10-04 08:26:48.005246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00004040 cdw11:00000000 00:08:55.515 [2024-10-04 08:26:48.005259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:55.515 [2024-10-04 08:26:48.005311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00004040 cdw11:00000000 00:08:55.515 [2024-10-04 08:26:48.005324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:55.515 [2024-10-04 08:26:48.005376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000401a cdw11:00000000 00:08:55.515 [2024-10-04 08:26:48.005390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:55.515 #13 NEW cov: 11737 ft: 13351 corp: 9/26b lim: 10 exec/s: 0 rss: 67Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:08:55.515 [2024-10-04 08:26:48.044759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a85 cdw11:00000000 00:08:55.515 [2024-10-04 08:26:48.044786] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:55.515 #14 NEW cov: 11737 ft: 13400 corp: 10/29b lim: 10 exec/s: 0 rss: 67Mb L: 3/10 MS: 1 CrossOver- 00:08:55.515 [2024-10-04 08:26:48.075025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a5f cdw11:00000000 00:08:55.516 [2024-10-04 08:26:48.075051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:55.516 [2024-10-04 08:26:48.075104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000850a cdw11:00000000 00:08:55.516 [2024-10-04 08:26:48.075118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:55.516 #15 NEW cov: 11737 ft: 13575 corp: 11/33b lim: 10 exec/s: 0 rss: 67Mb L: 4/10 MS: 1 InsertByte- 00:08:55.516 [2024-10-04 08:26:48.115137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000285f cdw11:00000000 00:08:55.516 [2024-10-04 08:26:48.115163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:55.516 [2024-10-04 08:26:48.115220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000850a cdw11:00000000 00:08:55.516 [2024-10-04 08:26:48.115234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:55.516 #16 NEW cov: 11737 ft: 13678 corp: 12/37b lim: 10 exec/s: 0 rss: 68Mb L: 4/10 MS: 1 ChangeByte- 00:08:55.516 [2024-10-04 08:26:48.155231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a5f cdw11:00000000 00:08:55.516 [2024-10-04 08:26:48.155258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:55.516 [2024-10-04 08:26:48.155314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000050a cdw11:00000000 00:08:55.516 [2024-10-04 08:26:48.155328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:55.516 #17 NEW cov: 11737 ft: 13704 corp: 13/41b lim: 10 exec/s: 0 rss: 68Mb L: 4/10 MS: 1 ChangeBit- 00:08:55.516 [2024-10-04 08:26:48.195398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000885f cdw11:00000000 00:08:55.516 [2024-10-04 08:26:48.195425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:55.516 [2024-10-04 08:26:48.195476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000850a cdw11:00000000 00:08:55.516 [2024-10-04 08:26:48.195490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:55.774 #18 NEW cov: 11737 ft: 13755 corp: 14/45b lim: 10 exec/s: 0 rss: 68Mb L: 4/10 MS: 1 ChangeByte- 00:08:55.775 [2024-10-04 08:26:48.235325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002685 cdw11:00000000 00:08:55.775 [2024-10-04 08:26:48.235353] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:55.775 #19 NEW cov: 11737 ft: 13801 corp: 15/48b lim: 10 exec/s: 0 rss: 68Mb L: 3/10 MS: 1 InsertByte- 00:08:55.775 [2024-10-04 08:26:48.275475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000851a cdw11:00000000 00:08:55.775 [2024-10-04 08:26:48.275501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:55.775 #20 NEW cov: 11737 ft: 13896 corp: 16/50b lim: 10 exec/s: 0 rss: 68Mb L: 2/10 MS: 1 ChangeByte- 00:08:55.775 [2024-10-04 08:26:48.315711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a05 cdw11:00000000 00:08:55.775 [2024-10-04 08:26:48.315737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:55.775 [2024-10-04 08:26:48.315791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:55.775 [2024-10-04 08:26:48.315805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:55.775 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:55.775 #21 NEW cov: 11760 ft: 13904 corp: 17/54b lim: 10 exec/s: 0 rss: 68Mb L: 4/10 MS: 1 CopyPart- 00:08:55.775 [2024-10-04 08:26:48.355842] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008805 cdw11:00000000 00:08:55.775 [2024-10-04 08:26:48.355869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:55.775 [2024-10-04 08:26:48.355919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:55.775 [2024-10-04 08:26:48.355933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:55.775 #22 NEW cov: 11760 ft: 13913 corp: 18/58b lim: 10 exec/s: 0 rss: 68Mb L: 4/10 MS: 1 CrossOver- 00:08:55.775 [2024-10-04 08:26:48.395836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002c85 cdw11:00000000 00:08:55.775 [2024-10-04 08:26:48.395862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:55.775 #23 NEW cov: 11760 ft: 13963 corp: 19/61b lim: 10 exec/s: 0 rss: 68Mb L: 3/10 MS: 1 ChangeByte- 00:08:55.775 [2024-10-04 08:26:48.435970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002885 cdw11:00000000 00:08:55.775 [2024-10-04 08:26:48.435999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.034 #24 NEW cov: 11760 ft: 14003 corp: 20/64b lim: 10 exec/s: 24 rss: 68Mb L: 3/10 MS: 1 EraseBytes- 00:08:56.034 [2024-10-04 08:26:48.476588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008540 cdw11:00000000 00:08:56.034 [2024-10-04 08:26:48.476614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:08:56.034 [2024-10-04 08:26:48.476667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00004040 cdw11:00000000 00:08:56.034 [2024-10-04 08:26:48.476680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:56.034 [2024-10-04 08:26:48.476732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00004040 cdw11:00000000 00:08:56.034 [2024-10-04 08:26:48.476745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:56.034 [2024-10-04 08:26:48.476795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00004040 cdw11:00000000 00:08:56.034 [2024-10-04 08:26:48.476808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:56.034 [2024-10-04 08:26:48.476859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000401b cdw11:00000000 00:08:56.034 [2024-10-04 08:26:48.476873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:56.034 #25 NEW cov: 11760 ft: 14017 corp: 21/74b lim: 10 exec/s: 25 rss: 68Mb L: 10/10 MS: 1 ChangeBit- 00:08:56.034 [2024-10-04 08:26:48.516444] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:56.034 [2024-10-04 08:26:48.516470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.034 [2024-10-04 08:26:48.516524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:56.034 [2024-10-04 08:26:48.516538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:56.034 [2024-10-04 08:26:48.516591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:56.034 [2024-10-04 08:26:48.516604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:56.034 #26 NEW cov: 11760 ft: 14152 corp: 22/80b lim: 10 exec/s: 26 rss: 68Mb L: 6/10 MS: 1 InsertRepeatedBytes- 00:08:56.034 [2024-10-04 08:26:48.556435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a5f cdw11:00000000 00:08:56.034 [2024-10-04 08:26:48.556460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.034 [2024-10-04 08:26:48.556511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000600 cdw11:00000000 00:08:56.034 [2024-10-04 08:26:48.556525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:56.034 #27 NEW cov: 11760 ft: 14160 corp: 23/84b lim: 10 exec/s: 27 rss: 68Mb L: 4/10 MS: 1 CMP- DE: "\006\000"- 00:08:56.034 [2024-10-04 08:26:48.596414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:56.034 [2024-10-04 
08:26:48.596439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.034 #28 NEW cov: 11760 ft: 14212 corp: 24/86b lim: 10 exec/s: 28 rss: 68Mb L: 2/10 MS: 1 ShuffleBytes- 00:08:56.034 [2024-10-04 08:26:48.636551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008518 cdw11:00000000 00:08:56.034 [2024-10-04 08:26:48.636579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.034 #29 NEW cov: 11760 ft: 14224 corp: 25/88b lim: 10 exec/s: 29 rss: 68Mb L: 2/10 MS: 1 ChangeBit- 00:08:56.034 [2024-10-04 08:26:48.666624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007e0a cdw11:00000000 00:08:56.034 [2024-10-04 08:26:48.666650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.034 #30 NEW cov: 11760 ft: 14230 corp: 26/91b lim: 10 exec/s: 30 rss: 68Mb L: 3/10 MS: 1 InsertByte- 00:08:56.034 [2024-10-04 08:26:48.707149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:08:56.034 [2024-10-04 08:26:48.707174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.034 [2024-10-04 08:26:48.707245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:56.034 [2024-10-04 08:26:48.707260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:56.034 [2024-10-04 08:26:48.707312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00001205 cdw11:00000000 00:08:56.034 [2024-10-04 08:26:48.707326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:56.034 [2024-10-04 08:26:48.707377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:56.034 [2024-10-04 08:26:48.707391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:56.293 #31 NEW cov: 11760 ft: 14277 corp: 27/99b lim: 10 exec/s: 31 rss: 69Mb L: 8/10 MS: 1 CMP- DE: "\000\000\000\022"- 00:08:56.293 [2024-10-04 08:26:48.746885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a41 cdw11:00000000 00:08:56.293 [2024-10-04 08:26:48.746911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.293 #32 NEW cov: 11760 ft: 14286 corp: 28/101b lim: 10 exec/s: 32 rss: 69Mb L: 2/10 MS: 1 InsertByte- 00:08:56.293 [2024-10-04 08:26:48.776980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000028f8 cdw11:00000000 00:08:56.293 [2024-10-04 08:26:48.777006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.293 #33 NEW cov: 11760 ft: 14320 corp: 29/104b lim: 10 exec/s: 33 rss: 69Mb L: 3/10 MS: 1 CrossOver- 00:08:56.293 [2024-10-04 08:26:48.817233] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a5f cdw11:00000000 00:08:56.293 [2024-10-04 08:26:48.817260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.293 [2024-10-04 08:26:48.817312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000850e cdw11:00000000 00:08:56.293 [2024-10-04 08:26:48.817326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:56.293 #34 NEW cov: 11760 ft: 14387 corp: 30/108b lim: 10 exec/s: 34 rss: 69Mb L: 4/10 MS: 1 ChangeBit- 00:08:56.293 [2024-10-04 08:26:48.847440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000851a cdw11:00000000 00:08:56.293 [2024-10-04 08:26:48.847476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.293 [2024-10-04 08:26:48.847531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:56.293 [2024-10-04 08:26:48.847549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:56.293 [2024-10-04 08:26:48.847590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000012 cdw11:00000000 00:08:56.293 [2024-10-04 08:26:48.847603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:56.293 #35 NEW cov: 11760 ft: 14401 corp: 31/114b lim: 10 exec/s: 35 rss: 69Mb L: 6/10 MS: 1 PersAutoDict- DE: "\000\000\000\022"- 00:08:56.293 [2024-10-04 08:26:48.887419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000eb05 cdw11:00000000 00:08:56.293 [2024-10-04 08:26:48.887445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.294 [2024-10-04 08:26:48.887497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:56.294 [2024-10-04 08:26:48.887511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:56.294 #36 NEW cov: 11760 ft: 14435 corp: 32/118b lim: 10 exec/s: 36 rss: 69Mb L: 4/10 MS: 1 ChangeByte- 00:08:56.294 [2024-10-04 08:26:48.927434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:56.294 [2024-10-04 08:26:48.927460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.294 #37 NEW cov: 11760 ft: 14457 corp: 33/120b lim: 10 exec/s: 37 rss: 69Mb L: 2/10 MS: 1 CopyPart- 00:08:56.294 [2024-10-04 08:26:48.957463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001885 cdw11:00000000 00:08:56.294 [2024-10-04 08:26:48.957488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.553 #38 NEW cov: 11760 ft: 14494 corp: 34/122b lim: 10 exec/s: 38 rss: 69Mb L: 2/10 MS: 1 ShuffleBytes- 00:08:56.553 [2024-10-04 
08:26:48.997859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a05 cdw11:00000000 00:08:56.553 [2024-10-04 08:26:48.997885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.553 [2024-10-04 08:26:48.997939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:56.553 [2024-10-04 08:26:48.997953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:56.553 [2024-10-04 08:26:48.998008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000050a cdw11:00000000 00:08:56.553 [2024-10-04 08:26:48.998022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:56.553 #39 NEW cov: 11760 ft: 14513 corp: 35/129b lim: 10 exec/s: 39 rss: 69Mb L: 7/10 MS: 1 CopyPart- 00:08:56.553 [2024-10-04 08:26:49.037835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000ab2 cdw11:00000000 00:08:56.553 [2024-10-04 08:26:49.037861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.553 [2024-10-04 08:26:49.037915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00005f85 cdw11:00000000 00:08:56.553 [2024-10-04 08:26:49.037929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:56.554 #40 NEW cov: 11760 ft: 14519 corp: 36/134b lim: 10 exec/s: 40 rss: 69Mb L: 5/10 MS: 1 InsertByte- 00:08:56.554 [2024-10-04 08:26:49.077888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002f0a cdw11:00000000 00:08:56.554 [2024-10-04 08:26:49.077916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.554 #41 NEW cov: 11760 ft: 14522 corp: 37/136b lim: 10 exec/s: 41 rss: 69Mb L: 2/10 MS: 1 ChangeByte- 00:08:56.554 [2024-10-04 08:26:49.107915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000850a cdw11:00000000 00:08:56.554 [2024-10-04 08:26:49.107940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.554 #42 NEW cov: 11760 ft: 14531 corp: 38/139b lim: 10 exec/s: 42 rss: 69Mb L: 3/10 MS: 1 ShuffleBytes- 00:08:56.554 [2024-10-04 08:26:49.148035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008a0a cdw11:00000000 00:08:56.554 [2024-10-04 08:26:49.148060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.554 #43 NEW cov: 11760 ft: 14538 corp: 39/141b lim: 10 exec/s: 43 rss: 69Mb L: 2/10 MS: 1 ChangeBit- 00:08:56.554 [2024-10-04 08:26:49.178269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a41 cdw11:00000000 00:08:56.554 [2024-10-04 08:26:49.178295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.554 [2024-10-04 
08:26:49.178348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000850a cdw11:00000000 00:08:56.554 [2024-10-04 08:26:49.178361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:56.554 #44 NEW cov: 11760 ft: 14551 corp: 40/145b lim: 10 exec/s: 44 rss: 69Mb L: 4/10 MS: 1 CrossOver- 00:08:56.554 [2024-10-04 08:26:49.218359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a05 cdw11:00000000 00:08:56.554 [2024-10-04 08:26:49.218385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.554 [2024-10-04 08:26:49.218438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:56.554 [2024-10-04 08:26:49.218452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:56.813 #45 NEW cov: 11760 ft: 14566 corp: 41/150b lim: 10 exec/s: 45 rss: 69Mb L: 5/10 MS: 1 EraseBytes- 00:08:56.813 [2024-10-04 08:26:49.258384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00006f0a cdw11:00000000 00:08:56.813 [2024-10-04 08:26:49.258410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.814 #46 NEW cov: 11760 ft: 14582 corp: 42/152b lim: 10 exec/s: 46 rss: 69Mb L: 2/10 MS: 1 ChangeBit- 00:08:56.814 [2024-10-04 08:26:49.298523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a60a cdw11:00000000 00:08:56.814 [2024-10-04 08:26:49.298549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.814 #47 NEW cov: 11760 ft: 14665 corp: 43/154b lim: 10 exec/s: 47 rss: 69Mb L: 2/10 MS: 1 ChangeByte- 00:08:56.814 [2024-10-04 08:26:49.338719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000285f cdw11:00000000 00:08:56.814 [2024-10-04 08:26:49.338746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.814 [2024-10-04 08:26:49.338798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00007b85 cdw11:00000000 00:08:56.814 [2024-10-04 08:26:49.338812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:56.814 #48 NEW cov: 11760 ft: 14702 corp: 44/159b lim: 10 exec/s: 48 rss: 69Mb L: 5/10 MS: 1 InsertByte- 00:08:56.814 [2024-10-04 08:26:49.378867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005f85 cdw11:00000000 00:08:56.814 [2024-10-04 08:26:49.378896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.814 [2024-10-04 08:26:49.378951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000850a cdw11:00000000 00:08:56.814 [2024-10-04 08:26:49.378964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:56.814 #49 NEW cov: 
11760 ft: 14709 corp: 45/163b lim: 10 exec/s: 49 rss: 69Mb L: 4/10 MS: 1 CopyPart- 00:08:56.814 [2024-10-04 08:26:49.418824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000028f8 cdw11:00000000 00:08:56.814 [2024-10-04 08:26:49.418850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.814 #50 NEW cov: 11760 ft: 14724 corp: 46/166b lim: 10 exec/s: 25 rss: 69Mb L: 3/10 MS: 1 CopyPart- 00:08:56.814 #50 DONE cov: 11760 ft: 14724 corp: 46/166b lim: 10 exec/s: 25 rss: 69Mb 00:08:56.814 ###### Recommended dictionary. ###### 00:08:56.814 "\006\000" # Uses: 0 00:08:56.814 "\000\000\000\022" # Uses: 1 00:08:56.814 ###### End of recommended dictionary. ###### 00:08:56.814 Done 50 runs in 2 second(s) 00:08:57.073 08:26:49 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_7.conf 00:08:57.073 08:26:49 -- ../common.sh@72 -- # (( i++ )) 00:08:57.073 08:26:49 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:57.073 08:26:49 -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1 00:08:57.073 08:26:49 -- nvmf/run.sh@23 -- # local fuzzer_type=8 00:08:57.073 08:26:49 -- nvmf/run.sh@24 -- # local timen=1 00:08:57.073 08:26:49 -- nvmf/run.sh@25 -- # local core=0x1 00:08:57.073 08:26:49 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:57.073 08:26:49 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf 00:08:57.073 08:26:49 -- nvmf/run.sh@29 -- # printf %02d 8 00:08:57.073 08:26:49 -- nvmf/run.sh@29 -- # port=4408 00:08:57.073 08:26:49 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:57.073 08:26:49 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' 00:08:57.073 08:26:49 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:57.073 08:26:49 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 -r /var/tmp/spdk8.sock 00:08:57.074 [2024-10-04 08:26:49.600454] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
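Two details worth decoding from the run that just finished: the "MS: k A-B-" suffix on each NEW line is the mutation sequence libFuzzer applied to a parent corpus entry (EraseBytes, InsertByte, CrossOver, ShuffleBytes, ...), and the "Recommended dictionary" block lists byte strings the fuzzer learned from comparisons — "\000\000\000\022" is the entry replayed by the "MS: 1 PersAutoDict- DE: ..." lines above. The snippet below is a simplified, illustrative re-implementation of two of those mutators, written only to make the trail concrete; it is not libFuzzer's actual code.

#include <stdint.h>
#include <stddef.h>
#include <string.h>

/* EraseBytes: remove n bytes at pos, shrinking the input (how a 10-byte
 * entry can become the "L: 2/10" two-byte one). */
static size_t erase_bytes(uint8_t *buf, size_t size, size_t pos, size_t n)
{
	if (pos >= size)
		return size;
	if (n > size - pos)
		n = size - pos;
	memmove(buf + pos, buf + pos + n, size - pos - n);
	return size - n;
}

/* InsertByte: splice one new byte in at pos (buf needs size + 1 capacity). */
static size_t insert_byte(uint8_t *buf, size_t size, size_t pos, uint8_t b)
{
	if (pos > size)
		pos = size;
	memmove(buf + pos + 1, buf + pos, size - pos);
	buf[pos] = b;
	return size + 1;
}

int main(void)
{
	uint8_t buf[11] = {0x85, 0x0a, 0xd9, 0xd9, 0xd9, 0xd9, 0xd9, 0xd9, 0xd9, 0xd9};
	size_t size = 10;

	size = erase_bytes(buf, size, 2, 8);    /* 10 bytes -> 2 bytes  */
	size = insert_byte(buf, size, 1, 0x1a); /*  2 bytes -> 3 bytes  */
	return size == 3 ? 0 : 1;
}

So a trail like "MS: 2 EraseBytes-InsertByte-" means a corpus entry was first truncated and then had one byte spliced in before being fed to the next target — here fuzzer 8, which the invocation above points at port 4408.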
00:08:57.074 [2024-10-04 08:26:49.600524] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1015563 ] 00:08:57.074 EAL: No free 2048 kB hugepages reported on node 1 00:08:57.333 [2024-10-04 08:26:49.780283] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:57.333 [2024-10-04 08:26:49.799641] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:57.333 [2024-10-04 08:26:49.799765] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:57.333 [2024-10-04 08:26:49.851124] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:57.333 [2024-10-04 08:26:49.867494] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:08:57.333 INFO: Running with entropic power schedule (0xFF, 100). 00:08:57.333 INFO: Seed: 883729646 00:08:57.333 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:57.333 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:57.333 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:57.333 INFO: A corpus is not provided, starting from an empty corpus 00:08:57.333 [2024-10-04 08:26:49.937544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.333 [2024-10-04 08:26:49.937582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.333 #2 INITED cov: 11561 ft: 11556 corp: 1/1b exec/s: 0 rss: 65Mb 00:08:57.333 [2024-10-04 08:26:49.987462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.333 [2024-10-04 08:26:49.987491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.593 #3 NEW cov: 11674 ft: 12214 corp: 2/2b lim: 5 exec/s: 0 rss: 66Mb L: 1/1 MS: 1 ShuffleBytes- 00:08:57.593 [2024-10-04 08:26:50.047782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.593 [2024-10-04 08:26:50.047811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.593 #4 NEW cov: 11680 ft: 12401 corp: 3/3b lim: 5 exec/s: 0 rss: 66Mb L: 1/1 MS: 1 ChangeByte- 00:08:57.593 [2024-10-04 08:26:50.108790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.593 [2024-10-04 08:26:50.108823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.593 [2024-10-04 08:26:50.108953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.593 [2024-10-04 08:26:50.108971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 
cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:57.593 [2024-10-04 08:26:50.109100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.593 [2024-10-04 08:26:50.109120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:57.593 [2024-10-04 08:26:50.109245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.593 [2024-10-04 08:26:50.109264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:57.593 #5 NEW cov: 11765 ft: 13452 corp: 4/7b lim: 5 exec/s: 0 rss: 66Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:08:57.593 [2024-10-04 08:26:50.158316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.593 [2024-10-04 08:26:50.158345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.593 [2024-10-04 08:26:50.158468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.593 [2024-10-04 08:26:50.158485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:57.593 #6 NEW cov: 11765 ft: 13713 corp: 5/9b lim: 5 exec/s: 0 rss: 66Mb L: 2/4 MS: 1 CopyPart- 00:08:57.593 [2024-10-04 08:26:50.208420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.593 [2024-10-04 08:26:50.208449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.593 [2024-10-04 08:26:50.208584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.593 [2024-10-04 08:26:50.208607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:57.593 #7 NEW cov: 11765 ft: 13924 corp: 6/11b lim: 5 exec/s: 0 rss: 66Mb L: 2/4 MS: 1 InsertByte- 00:08:57.593 [2024-10-04 08:26:50.259258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.593 [2024-10-04 08:26:50.259286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.593 [2024-10-04 08:26:50.259422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.593 [2024-10-04 08:26:50.259442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:57.593 [2024-10-04 08:26:50.259580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.593 [2024-10-04 08:26:50.259596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:57.593 [2024-10-04 08:26:50.259728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.593 [2024-10-04 08:26:50.259745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:57.853 #8 NEW cov: 11765 ft: 13985 corp: 7/15b lim: 5 exec/s: 0 rss: 66Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:08:57.853 [2024-10-04 08:26:50.319384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.853 [2024-10-04 08:26:50.319411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.853 [2024-10-04 08:26:50.319579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.853 [2024-10-04 08:26:50.319597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:57.853 [2024-10-04 08:26:50.319727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.853 [2024-10-04 08:26:50.319745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:57.853 [2024-10-04 08:26:50.319870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.853 [2024-10-04 08:26:50.319888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:57.853 #9 NEW cov: 11765 ft: 14043 corp: 8/19b lim: 5 exec/s: 0 rss: 66Mb L: 4/4 MS: 1 ChangeBit- 00:08:57.853 [2024-10-04 08:26:50.378624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.853 [2024-10-04 08:26:50.378652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.853 #10 NEW cov: 11765 ft: 14072 corp: 9/20b lim: 5 exec/s: 0 rss: 66Mb L: 1/4 MS: 1 ChangeBit- 00:08:57.853 [2024-10-04 08:26:50.429730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.853 [2024-10-04 08:26:50.429758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.853 [2024-10-04 08:26:50.429899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.853 [2024-10-04 08:26:50.429916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:08:57.853 [2024-10-04 08:26:50.430046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.853 [2024-10-04 08:26:50.430075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:57.853 [2024-10-04 08:26:50.430213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.853 [2024-10-04 08:26:50.430242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:57.853 #11 NEW cov: 11765 ft: 14156 corp: 10/24b lim: 5 exec/s: 0 rss: 66Mb L: 4/4 MS: 1 ChangeBit- 00:08:57.853 [2024-10-04 08:26:50.489838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.853 [2024-10-04 08:26:50.489864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.853 [2024-10-04 08:26:50.490001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.853 [2024-10-04 08:26:50.490019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:57.853 [2024-10-04 08:26:50.490145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.853 [2024-10-04 08:26:50.490161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:57.853 [2024-10-04 08:26:50.490297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.853 [2024-10-04 08:26:50.490314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:57.853 #12 NEW cov: 11765 ft: 14194 corp: 11/28b lim: 5 exec/s: 0 rss: 66Mb L: 4/4 MS: 1 ChangeBit- 00:08:58.113 [2024-10-04 08:26:50.550448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.113 [2024-10-04 08:26:50.550476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:58.113 [2024-10-04 08:26:50.550614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.113 [2024-10-04 08:26:50.550630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:58.113 [2024-10-04 08:26:50.550764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.113 [2024-10-04 08:26:50.550779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:58.113 [2024-10-04 08:26:50.550921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.113 [2024-10-04 08:26:50.550939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:58.113 [2024-10-04 08:26:50.551077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.113 [2024-10-04 08:26:50.551093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:58.113 #13 NEW cov: 11765 ft: 14303 corp: 12/33b lim: 5 exec/s: 0 rss: 66Mb L: 5/5 MS: 1 InsertByte- 00:08:58.113 [2024-10-04 08:26:50.600297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.113 [2024-10-04 08:26:50.600326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:58.113 [2024-10-04 08:26:50.600452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.113 [2024-10-04 08:26:50.600469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:58.113 [2024-10-04 08:26:50.600597] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.113 [2024-10-04 08:26:50.600615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:58.113 [2024-10-04 08:26:50.600747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.113 [2024-10-04 08:26:50.600766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:58.113 #14 NEW cov: 11765 ft: 14376 corp: 13/37b lim: 5 exec/s: 0 rss: 66Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:08:58.113 [2024-10-04 08:26:50.660810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.113 [2024-10-04 08:26:50.660837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:58.113 [2024-10-04 08:26:50.660964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.113 [2024-10-04 08:26:50.660986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:58.113 [2024-10-04 08:26:50.661119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.113 [2024-10-04 
08:26:50.661136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:58.113 [2024-10-04 08:26:50.661272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.113 [2024-10-04 08:26:50.661290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:58.113 [2024-10-04 08:26:50.661420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.113 [2024-10-04 08:26:50.661437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:58.113 #15 NEW cov: 11765 ft: 14416 corp: 14/42b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 CrossOver- 00:08:58.113 [2024-10-04 08:26:50.720069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.113 [2024-10-04 08:26:50.720098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:58.113 [2024-10-04 08:26:50.720255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.114 [2024-10-04 08:26:50.720272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:58.114 #16 NEW cov: 11765 ft: 14435 corp: 15/44b lim: 5 exec/s: 0 rss: 67Mb L: 2/5 MS: 1 ChangeBit- 00:08:58.114 [2024-10-04 08:26:50.770421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.114 [2024-10-04 08:26:50.770449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:58.114 [2024-10-04 08:26:50.770599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.114 [2024-10-04 08:26:50.770617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:58.114 [2024-10-04 08:26:50.770748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.114 [2024-10-04 08:26:50.770765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:58.373 #17 NEW cov: 11765 ft: 14663 corp: 16/47b lim: 5 exec/s: 0 rss: 67Mb L: 3/5 MS: 1 CrossOver- 00:08:58.373 [2024-10-04 08:26:50.829923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.373 [2024-10-04 08:26:50.829949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:58.632 NEW_FUNC[1/1]: 0x1967b18 in get_rusage 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:58.632 #18 NEW cov: 11788 ft: 14694 corp: 17/48b lim: 5 exec/s: 18 rss: 68Mb L: 1/5 MS: 1 ChangeByte- 00:08:58.632 [2024-10-04 08:26:51.151843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.632 [2024-10-04 08:26:51.151878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:58.632 [2024-10-04 08:26:51.151999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.632 [2024-10-04 08:26:51.152018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:58.632 [2024-10-04 08:26:51.152152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.632 [2024-10-04 08:26:51.152170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:58.632 [2024-10-04 08:26:51.152314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.632 [2024-10-04 08:26:51.152334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:58.632 #19 NEW cov: 11788 ft: 14701 corp: 18/52b lim: 5 exec/s: 19 rss: 68Mb L: 4/5 MS: 1 CopyPart- 00:08:58.632 [2024-10-04 08:26:51.211933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.632 [2024-10-04 08:26:51.211961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:58.632 [2024-10-04 08:26:51.212095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.632 [2024-10-04 08:26:51.212113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:58.632 [2024-10-04 08:26:51.212232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.632 [2024-10-04 08:26:51.212249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:58.632 [2024-10-04 08:26:51.212383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.632 [2024-10-04 08:26:51.212399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:58.632 #20 NEW cov: 11788 ft: 14705 corp: 19/56b lim: 5 exec/s: 20 rss: 68Mb L: 4/5 MS: 1 CrossOver- 00:08:58.632 [2024-10-04 08:26:51.271557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 
cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.632 [2024-10-04 08:26:51.271584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:58.632 [2024-10-04 08:26:51.271722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.632 [2024-10-04 08:26:51.271738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:58.632 #21 NEW cov: 11788 ft: 14723 corp: 20/58b lim: 5 exec/s: 21 rss: 68Mb L: 2/5 MS: 1 EraseBytes- 00:08:58.892 [2024-10-04 08:26:51.332721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.892 [2024-10-04 08:26:51.332748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:58.892 [2024-10-04 08:26:51.332882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.892 [2024-10-04 08:26:51.332900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:58.892 [2024-10-04 08:26:51.333037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.892 [2024-10-04 08:26:51.333052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:58.892 [2024-10-04 08:26:51.333184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.892 [2024-10-04 08:26:51.333208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:58.892 [2024-10-04 08:26:51.333346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.892 [2024-10-04 08:26:51.333365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:58.892 #22 NEW cov: 11788 ft: 14730 corp: 21/63b lim: 5 exec/s: 22 rss: 68Mb L: 5/5 MS: 1 CopyPart- 00:08:58.892 [2024-10-04 08:26:51.381665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.892 [2024-10-04 08:26:51.381698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:58.892 #23 NEW cov: 11788 ft: 14746 corp: 22/64b lim: 5 exec/s: 23 rss: 68Mb L: 1/5 MS: 1 ShuffleBytes- 00:08:58.892 [2024-10-04 08:26:51.432941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.892 [2024-10-04 08:26:51.432970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:58.892 [2024-10-04 08:26:51.433109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.892 [2024-10-04 08:26:51.433126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:58.892 [2024-10-04 08:26:51.433271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.892 [2024-10-04 08:26:51.433289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:58.892 [2024-10-04 08:26:51.433418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.892 [2024-10-04 08:26:51.433435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:58.892 [2024-10-04 08:26:51.433567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.892 [2024-10-04 08:26:51.433583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:58.892 #24 NEW cov: 11788 ft: 14786 corp: 23/69b lim: 5 exec/s: 24 rss: 68Mb L: 5/5 MS: 1 InsertByte- 00:08:58.892 [2024-10-04 08:26:51.492331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.892 [2024-10-04 08:26:51.492370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:58.892 [2024-10-04 08:26:51.492513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.892 [2024-10-04 08:26:51.492532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:58.892 #25 NEW cov: 11788 ft: 14799 corp: 24/71b lim: 5 exec/s: 25 rss: 68Mb L: 2/5 MS: 1 CopyPart- 00:08:58.892 [2024-10-04 08:26:51.553286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.892 [2024-10-04 08:26:51.553314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:58.892 [2024-10-04 08:26:51.553465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.892 [2024-10-04 08:26:51.553482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:58.892 [2024-10-04 08:26:51.553616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.892 [2024-10-04 08:26:51.553633] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:58.892 [2024-10-04 08:26:51.553759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.892 [2024-10-04 08:26:51.553780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:58.892 [2024-10-04 08:26:51.553907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:58.892 [2024-10-04 08:26:51.553923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:59.151 #26 NEW cov: 11788 ft: 14851 corp: 25/76b lim: 5 exec/s: 26 rss: 69Mb L: 5/5 MS: 1 CrossOver- 00:08:59.152 [2024-10-04 08:26:51.612643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.152 [2024-10-04 08:26:51.612670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:59.152 [2024-10-04 08:26:51.612814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.152 [2024-10-04 08:26:51.612834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:59.152 #27 NEW cov: 11788 ft: 14853 corp: 26/78b lim: 5 exec/s: 27 rss: 69Mb L: 2/5 MS: 1 ChangeByte- 00:08:59.152 [2024-10-04 08:26:51.663301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.152 [2024-10-04 08:26:51.663328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:59.152 [2024-10-04 08:26:51.663460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.152 [2024-10-04 08:26:51.663478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:59.152 [2024-10-04 08:26:51.663598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.152 [2024-10-04 08:26:51.663615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:59.152 [2024-10-04 08:26:51.663740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.152 [2024-10-04 08:26:51.663756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:59.152 #28 NEW cov: 11788 ft: 14925 corp: 27/82b lim: 5 exec/s: 28 rss: 69Mb L: 4/5 MS: 1 ChangeBit- 00:08:59.152 [2024-10-04 08:26:51.713236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.152 [2024-10-04 08:26:51.713263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:59.152 [2024-10-04 08:26:51.713435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.152 [2024-10-04 08:26:51.713451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:59.152 [2024-10-04 08:26:51.713577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.152 [2024-10-04 08:26:51.713594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:59.152 #29 NEW cov: 11788 ft: 14932 corp: 28/85b lim: 5 exec/s: 29 rss: 69Mb L: 3/5 MS: 1 EraseBytes- 00:08:59.152 [2024-10-04 08:26:51.773217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.152 [2024-10-04 08:26:51.773243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:59.152 [2024-10-04 08:26:51.773393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.152 [2024-10-04 08:26:51.773409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:59.152 #30 NEW cov: 11788 ft: 14976 corp: 29/87b lim: 5 exec/s: 30 rss: 69Mb L: 2/5 MS: 1 ChangeByte- 00:08:59.152 [2024-10-04 08:26:51.823127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.152 [2024-10-04 08:26:51.823156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:59.152 [2024-10-04 08:26:51.823263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.152 [2024-10-04 08:26:51.823281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:59.411 #31 NEW cov: 11788 ft: 14992 corp: 30/89b lim: 5 exec/s: 31 rss: 69Mb L: 2/5 MS: 1 CrossOver- 00:08:59.411 [2024-10-04 08:26:51.874159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.411 [2024-10-04 08:26:51.874185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:59.411 [2024-10-04 08:26:51.874315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.411 [2024-10-04 08:26:51.874330] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:59.411 [2024-10-04 08:26:51.874461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:59.412 [2024-10-04 08:26:51.874479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:08:59.412 [2024-10-04 08:26:51.874606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:59.412 [2024-10-04 08:26:51.874622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:08:59.412 #32 NEW cov: 11788 ft: 15016 corp: 31/93b lim: 5 exec/s: 32 rss: 69Mb L: 4/4 MS: 1 ShuffleBytes-
00:08:59.412 [2024-10-04 08:26:51.924211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:59.412 [2024-10-04 08:26:51.924238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:59.412 [2024-10-04 08:26:51.924363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:59.412 [2024-10-04 08:26:51.924378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:59.412 [2024-10-04 08:26:51.924503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:59.412 [2024-10-04 08:26:51.924519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:08:59.412 [2024-10-04 08:26:51.924659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:59.412 [2024-10-04 08:26:51.924676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:08:59.412 #33 NEW cov: 11788 ft: 15017 corp: 32/97b lim: 5 exec/s: 16 rss: 69Mb L: 4/5 MS: 1 ChangeBinInt-
00:08:59.412 #33 DONE cov: 11788 ft: 15017 corp: 32/97b lim: 5 exec/s: 16 rss: 69Mb
00:08:59.412 Done 33 runs in 2 second(s)
00:08:59.412 08:26:52 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_8.conf
00:08:59.412 08:26:52 -- ../common.sh@72 -- # (( i++ ))
00:08:59.412 08:26:52 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:59.412 08:26:52 -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1
00:08:59.412 08:26:52 -- nvmf/run.sh@23 -- # local fuzzer_type=9
00:08:59.412 08:26:52 -- nvmf/run.sh@24 -- # local timen=1
00:08:59.412 08:26:52 -- nvmf/run.sh@25 -- # local core=0x1
00:08:59.412 08:26:52 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9
00:08:59.412 08:26:52 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf
00:08:59.412 08:26:52 -- nvmf/run.sh@29 -- # printf %02d 9
00:08:59.412 08:26:52 -- nvmf/run.sh@29 -- # port=4409
00:08:59.412 08:26:52 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9
00:08:59.412 08:26:52 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409'
00:08:59.412 08:26:52 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:59.412 08:26:52 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 -r /var/tmp/spdk9.sock
00:08:59.671 [2024-10-04 08:26:52.108227] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization...
00:08:59.671 [2024-10-04 08:26:52.108297] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1016044 ]
00:08:59.671 EAL: No free 2048 kB hugepages reported on node 1
00:08:59.671 [2024-10-04 08:26:52.285912] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:59.671 [2024-10-04 08:26:52.306103] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:59.671 [2024-10-04 08:26:52.306234] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:59.930 [2024-10-04 08:26:52.357651] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:59.930 [2024-10-04 08:26:52.373944] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 ***
00:08:59.930 INFO: Running with entropic power schedule (0xFF, 100).
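[editor's note] The run.sh trace above contains everything needed to relaunch fuzzer 9 by hand. Below is a minimal sketch assembled only from the commands visible in this log; the ROOT variable is shorthand for the Jenkins workspace path seen above (adjust for a local checkout), and the redirect of sed's output into /tmp/fuzz_json_9.conf is an inference, since the xtrace line does not show redirections.

    # Sketch of the start_llvm_fuzz 9 step, reconstructed from the log above.
    ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk   # workspace path from the log
    PORT=4409                                                  # printf %02d 9 -> fuzzer 9 uses port 4409
    TRID="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:${PORT}"
    mkdir -p "${ROOT}/../corpus/llvm_nvmf_9"                   # per-fuzzer corpus directory
    # Point the stock fuzz config at port 4409 instead of 4420 (redirect is assumed):
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"${PORT}\"/" \
        "${ROOT}/test/fuzz/llvm/nvmf/fuzz_json.conf" > /tmp/fuzz_json_9.conf
    # Launch the fuzzer with the exact flags recorded at nvmf/run.sh@36:
    "${ROOT}/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
        -m 0x1 -s 512 -P "${ROOT}/../output/llvm/" \
        -F "${TRID}" -c /tmp/fuzz_json_9.conf -t 1 \
        -D "${ROOT}/../corpus/llvm_nvmf_9" -Z 9 -r /var/tmp/spdk9.sock

The -t 1 flag bounds the run to one minute, which matches the short iteration counts seen in these runs ("Done 33 runs in 2 second(s)" for fuzzer 8 above).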
00:08:59.930 INFO: Seed: 3390700559 00:08:59.930 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:59.930 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:59.930 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:59.930 INFO: A corpus is not provided, starting from an empty corpus 00:08:59.930 [2024-10-04 08:26:52.439221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.931 [2024-10-04 08:26:52.439251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:59.931 #2 INITED cov: 11561 ft: 11561 corp: 1/1b exec/s: 0 rss: 66Mb 00:08:59.931 [2024-10-04 08:26:52.469365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.931 [2024-10-04 08:26:52.469399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:59.931 [2024-10-04 08:26:52.469453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.931 [2024-10-04 08:26:52.469467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:59.931 #3 NEW cov: 11674 ft: 12767 corp: 2/3b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 CrossOver- 00:08:59.931 [2024-10-04 08:26:52.519509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.931 [2024-10-04 08:26:52.519537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:59.931 [2024-10-04 08:26:52.519592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.931 [2024-10-04 08:26:52.519606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:59.931 #4 NEW cov: 11680 ft: 12954 corp: 3/5b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 CopyPart- 00:08:59.931 [2024-10-04 08:26:52.559622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.931 [2024-10-04 08:26:52.559650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:59.931 [2024-10-04 08:26:52.559705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.931 [2024-10-04 08:26:52.559719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:59.931 #5 NEW cov: 11765 ft: 13155 corp: 4/7b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 CopyPart- 00:08:59.931 [2024-10-04 08:26:52.599529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT 
(0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.931 [2024-10-04 08:26:52.599556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.190 #6 NEW cov: 11765 ft: 13338 corp: 5/8b lim: 5 exec/s: 0 rss: 66Mb L: 1/2 MS: 1 EraseBytes- 00:09:00.190 [2024-10-04 08:26:52.639992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.190 [2024-10-04 08:26:52.640019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.190 [2024-10-04 08:26:52.640076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.190 [2024-10-04 08:26:52.640091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:00.190 [2024-10-04 08:26:52.640146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.190 [2024-10-04 08:26:52.640161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:00.190 #7 NEW cov: 11765 ft: 13578 corp: 6/11b lim: 5 exec/s: 0 rss: 66Mb L: 3/3 MS: 1 InsertByte- 00:09:00.190 [2024-10-04 08:26:52.679961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.190 [2024-10-04 08:26:52.679988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.190 [2024-10-04 08:26:52.680047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.190 [2024-10-04 08:26:52.680061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:00.190 #8 NEW cov: 11765 ft: 13665 corp: 7/13b lim: 5 exec/s: 0 rss: 67Mb L: 2/3 MS: 1 CopyPart- 00:09:00.190 [2024-10-04 08:26:52.720217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.190 [2024-10-04 08:26:52.720244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.190 [2024-10-04 08:26:52.720302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.190 [2024-10-04 08:26:52.720316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:00.190 [2024-10-04 08:26:52.720370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.190 [2024-10-04 08:26:52.720383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:00.190 #9 NEW cov: 11765 ft: 13697 corp: 8/16b lim: 5 exec/s: 0 rss: 67Mb L: 3/3 MS: 1 ShuffleBytes- 00:09:00.190 [2024-10-04 08:26:52.760017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.190 [2024-10-04 08:26:52.760043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.190 #10 NEW cov: 11765 ft: 13708 corp: 9/17b lim: 5 exec/s: 0 rss: 67Mb L: 1/3 MS: 1 EraseBytes- 00:09:00.190 [2024-10-04 08:26:52.800120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.190 [2024-10-04 08:26:52.800146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.190 #11 NEW cov: 11765 ft: 13777 corp: 10/18b lim: 5 exec/s: 0 rss: 67Mb L: 1/3 MS: 1 CopyPart- 00:09:00.190 [2024-10-04 08:26:52.840606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.190 [2024-10-04 08:26:52.840632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.190 [2024-10-04 08:26:52.840689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.190 [2024-10-04 08:26:52.840704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:00.190 [2024-10-04 08:26:52.840759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.190 [2024-10-04 08:26:52.840773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:00.190 #12 NEW cov: 11765 ft: 13816 corp: 11/21b lim: 5 exec/s: 0 rss: 67Mb L: 3/3 MS: 1 ChangeByte- 00:09:00.449 [2024-10-04 08:26:52.880720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.449 [2024-10-04 08:26:52.880747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.449 [2024-10-04 08:26:52.880801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.449 [2024-10-04 08:26:52.880818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:00.449 [2024-10-04 08:26:52.880871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.449 [2024-10-04 08:26:52.880884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:00.449 #13 NEW cov: 11765 ft: 13845 corp: 12/24b lim: 5 
exec/s: 0 rss: 67Mb L: 3/3 MS: 1 InsertByte- 00:09:00.449 [2024-10-04 08:26:52.920491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.449 [2024-10-04 08:26:52.920517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.449 #14 NEW cov: 11765 ft: 13882 corp: 13/25b lim: 5 exec/s: 0 rss: 67Mb L: 1/3 MS: 1 ShuffleBytes- 00:09:00.449 [2024-10-04 08:26:52.960900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.449 [2024-10-04 08:26:52.960925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.449 [2024-10-04 08:26:52.960979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.449 [2024-10-04 08:26:52.960993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:00.449 [2024-10-04 08:26:52.961048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.449 [2024-10-04 08:26:52.961061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:00.449 #15 NEW cov: 11765 ft: 13901 corp: 14/28b lim: 5 exec/s: 0 rss: 67Mb L: 3/3 MS: 1 InsertByte- 00:09:00.449 [2024-10-04 08:26:53.001008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.449 [2024-10-04 08:26:53.001035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.449 [2024-10-04 08:26:53.001091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.449 [2024-10-04 08:26:53.001106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:00.449 [2024-10-04 08:26:53.001158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.449 [2024-10-04 08:26:53.001171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:00.449 #16 NEW cov: 11765 ft: 13918 corp: 15/31b lim: 5 exec/s: 0 rss: 67Mb L: 3/3 MS: 1 ChangeBinInt- 00:09:00.449 [2024-10-04 08:26:53.041029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.449 [2024-10-04 08:26:53.041055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.449 #17 NEW cov: 11765 ft: 14046 corp: 16/32b lim: 5 exec/s: 0 rss: 67Mb L: 1/3 MS: 1 ChangeBit- 00:09:00.449 [2024-10-04 08:26:53.081132] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.449 [2024-10-04 08:26:53.081164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.449 [2024-10-04 08:26:53.081222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.449 [2024-10-04 08:26:53.081236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:00.449 #18 NEW cov: 11765 ft: 14059 corp: 17/34b lim: 5 exec/s: 0 rss: 67Mb L: 2/3 MS: 1 CopyPart- 00:09:00.450 [2024-10-04 08:26:53.121260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.450 [2024-10-04 08:26:53.121286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.450 [2024-10-04 08:26:53.121340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.450 [2024-10-04 08:26:53.121354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:00.708 #19 NEW cov: 11765 ft: 14102 corp: 18/36b lim: 5 exec/s: 0 rss: 67Mb L: 2/3 MS: 1 CrossOver- 00:09:00.708 [2024-10-04 08:26:53.161178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.708 [2024-10-04 08:26:53.161207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.708 #20 NEW cov: 11765 ft: 14130 corp: 19/37b lim: 5 exec/s: 0 rss: 67Mb L: 1/3 MS: 1 EraseBytes- 00:09:00.708 [2024-10-04 08:26:53.201473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.708 [2024-10-04 08:26:53.201499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.708 [2024-10-04 08:26:53.201553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.708 [2024-10-04 08:26:53.201567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:00.708 #21 NEW cov: 11765 ft: 14146 corp: 20/39b lim: 5 exec/s: 0 rss: 67Mb L: 2/3 MS: 1 EraseBytes- 00:09:00.708 [2024-10-04 08:26:53.241735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.708 [2024-10-04 08:26:53.241761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.708 [2024-10-04 08:26:53.241816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 
cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.708 [2024-10-04 08:26:53.241830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:00.709 [2024-10-04 08:26:53.241883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.709 [2024-10-04 08:26:53.241897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:00.709 #22 NEW cov: 11765 ft: 14185 corp: 21/42b lim: 5 exec/s: 0 rss: 68Mb L: 3/3 MS: 1 CopyPart- 00:09:00.709 [2024-10-04 08:26:53.281872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.709 [2024-10-04 08:26:53.281901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.709 [2024-10-04 08:26:53.281956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.709 [2024-10-04 08:26:53.281971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:00.709 [2024-10-04 08:26:53.282025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.709 [2024-10-04 08:26:53.282038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:00.709 #23 NEW cov: 11765 ft: 14203 corp: 22/45b lim: 5 exec/s: 0 rss: 68Mb L: 3/3 MS: 1 InsertByte- 00:09:00.709 [2024-10-04 08:26:53.322290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.709 [2024-10-04 08:26:53.322316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.709 [2024-10-04 08:26:53.322371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.709 [2024-10-04 08:26:53.322385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:00.709 [2024-10-04 08:26:53.322438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.709 [2024-10-04 08:26:53.322451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:00.709 [2024-10-04 08:26:53.322501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.709 [2024-10-04 08:26:53.322514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:00.709 [2024-10-04 08:26:53.322567] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.709 [2024-10-04 08:26:53.322580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:00.709 #24 NEW cov: 11765 ft: 14508 corp: 23/50b lim: 5 exec/s: 0 rss: 68Mb L: 5/5 MS: 1 CMP- DE: " \000"- 00:09:00.709 [2024-10-04 08:26:53.362263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.709 [2024-10-04 08:26:53.362290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.709 [2024-10-04 08:26:53.362344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.709 [2024-10-04 08:26:53.362360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:00.709 [2024-10-04 08:26:53.362413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.709 [2024-10-04 08:26:53.362426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:00.709 [2024-10-04 08:26:53.362478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.709 [2024-10-04 08:26:53.362491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:00.968 #25 NEW cov: 11765 ft: 14530 corp: 24/54b lim: 5 exec/s: 0 rss: 68Mb L: 4/5 MS: 1 CopyPart- 00:09:00.968 [2024-10-04 08:26:53.401916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.968 [2024-10-04 08:26:53.401943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.968 #26 NEW cov: 11765 ft: 14558 corp: 25/55b lim: 5 exec/s: 26 rss: 68Mb L: 1/5 MS: 1 CrossOver- 00:09:00.968 [2024-10-04 08:26:53.442347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.968 [2024-10-04 08:26:53.442373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.968 [2024-10-04 08:26:53.442429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.968 [2024-10-04 08:26:53.442443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:00.968 [2024-10-04 08:26:53.442498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.968 [2024-10-04 08:26:53.442511] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:00.968 #27 NEW cov: 11765 ft: 14573 corp: 26/58b lim: 5 exec/s: 27 rss: 68Mb L: 3/5 MS: 1 ChangeByte- 00:09:00.968 [2024-10-04 08:26:53.482441] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.968 [2024-10-04 08:26:53.482469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.968 [2024-10-04 08:26:53.482525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.968 [2024-10-04 08:26:53.482539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:00.968 [2024-10-04 08:26:53.482595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.968 [2024-10-04 08:26:53.482609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:00.968 #28 NEW cov: 11765 ft: 14590 corp: 27/61b lim: 5 exec/s: 28 rss: 68Mb L: 3/5 MS: 1 ChangeByte- 00:09:00.968 [2024-10-04 08:26:53.522417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.968 [2024-10-04 08:26:53.522444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.968 [2024-10-04 08:26:53.522499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.968 [2024-10-04 08:26:53.522513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:00.968 [2024-10-04 08:26:53.552487] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.968 [2024-10-04 08:26:53.552514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.968 [2024-10-04 08:26:53.552567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.968 [2024-10-04 08:26:53.552585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:00.968 #30 NEW cov: 11765 ft: 14595 corp: 28/63b lim: 5 exec/s: 30 rss: 68Mb L: 2/5 MS: 2 CrossOver-PersAutoDict- DE: " \000"- 00:09:00.968 [2024-10-04 08:26:53.592771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.968 [2024-10-04 08:26:53.592796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.968 [2024-10-04 08:26:53.592850] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.968 [2024-10-04 08:26:53.592864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:00.968 [2024-10-04 08:26:53.592917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.968 [2024-10-04 08:26:53.592931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:00.968 #31 NEW cov: 11765 ft: 14652 corp: 29/66b lim: 5 exec/s: 31 rss: 68Mb L: 3/5 MS: 1 EraseBytes- 00:09:00.968 [2024-10-04 08:26:53.632922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.968 [2024-10-04 08:26:53.632949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.969 [2024-10-04 08:26:53.633004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.969 [2024-10-04 08:26:53.633019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:00.969 [2024-10-04 08:26:53.633084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.969 [2024-10-04 08:26:53.633097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:01.228 #32 NEW cov: 11765 ft: 14655 corp: 30/69b lim: 5 exec/s: 32 rss: 68Mb L: 3/5 MS: 1 CopyPart- 00:09:01.228 [2024-10-04 08:26:53.672700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.228 [2024-10-04 08:26:53.672726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.228 #33 NEW cov: 11765 ft: 14661 corp: 31/70b lim: 5 exec/s: 33 rss: 68Mb L: 1/5 MS: 1 ChangeByte- 00:09:01.228 [2024-10-04 08:26:53.712973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.228 [2024-10-04 08:26:53.712999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.228 [2024-10-04 08:26:53.713056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.228 [2024-10-04 08:26:53.713071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:01.228 #34 NEW cov: 11765 ft: 14672 corp: 32/72b lim: 5 exec/s: 34 rss: 68Mb L: 2/5 MS: 1 ChangeBit- 00:09:01.228 [2024-10-04 08:26:53.752897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 
cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.228 [2024-10-04 08:26:53.752927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.228 #35 NEW cov: 11765 ft: 14686 corp: 33/73b lim: 5 exec/s: 35 rss: 68Mb L: 1/5 MS: 1 ChangeByte- 00:09:01.228 [2024-10-04 08:26:53.793358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.228 [2024-10-04 08:26:53.793385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.228 [2024-10-04 08:26:53.793442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.228 [2024-10-04 08:26:53.793456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:01.228 [2024-10-04 08:26:53.793511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.228 [2024-10-04 08:26:53.793525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:01.228 #36 NEW cov: 11765 ft: 14706 corp: 34/76b lim: 5 exec/s: 36 rss: 68Mb L: 3/5 MS: 1 ChangeBinInt- 00:09:01.228 [2024-10-04 08:26:53.833584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.228 [2024-10-04 08:26:53.833610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.228 [2024-10-04 08:26:53.833663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.228 [2024-10-04 08:26:53.833676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:01.228 [2024-10-04 08:26:53.833727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.228 [2024-10-04 08:26:53.833741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:01.228 [2024-10-04 08:26:53.833793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.228 [2024-10-04 08:26:53.833805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:01.228 #37 NEW cov: 11765 ft: 14741 corp: 35/80b lim: 5 exec/s: 37 rss: 68Mb L: 4/5 MS: 1 CopyPart- 00:09:01.228 [2024-10-04 08:26:53.873926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.228 [2024-10-04 08:26:53.873953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:09:01.228 [2024-10-04 08:26:53.874007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.228 [2024-10-04 08:26:53.874021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:01.228 [2024-10-04 08:26:53.874073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.228 [2024-10-04 08:26:53.874086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:01.228 [2024-10-04 08:26:53.874137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.228 [2024-10-04 08:26:53.874154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:01.228 [2024-10-04 08:26:53.874208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.228 [2024-10-04 08:26:53.874222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:01.228 #38 NEW cov: 11765 ft: 14760 corp: 36/85b lim: 5 exec/s: 38 rss: 68Mb L: 5/5 MS: 1 ChangeBit- 00:09:01.488 [2024-10-04 08:26:53.913507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.488 [2024-10-04 08:26:53.913533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.488 [2024-10-04 08:26:53.913597] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.488 [2024-10-04 08:26:53.913611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:01.488 #39 NEW cov: 11765 ft: 14821 corp: 37/87b lim: 5 exec/s: 39 rss: 68Mb L: 2/5 MS: 1 CrossOver- 00:09:01.488 [2024-10-04 08:26:53.953794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.488 [2024-10-04 08:26:53.953821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.488 [2024-10-04 08:26:53.953876] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.488 [2024-10-04 08:26:53.953890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:01.488 [2024-10-04 08:26:53.953945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.488 [2024-10-04 08:26:53.953960] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:01.488 #40 NEW cov: 11765 ft: 14863 corp: 38/90b lim: 5 exec/s: 40 rss: 68Mb L: 3/5 MS: 1 ChangeBinInt- 00:09:01.488 [2024-10-04 08:26:53.994217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.488 [2024-10-04 08:26:53.994244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.488 [2024-10-04 08:26:53.994296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.488 [2024-10-04 08:26:53.994310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:01.488 [2024-10-04 08:26:53.994362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.488 [2024-10-04 08:26:53.994376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:01.488 [2024-10-04 08:26:53.994427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.488 [2024-10-04 08:26:53.994441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:01.488 [2024-10-04 08:26:53.994497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.488 [2024-10-04 08:26:53.994511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:01.488 #41 NEW cov: 11765 ft: 14869 corp: 39/95b lim: 5 exec/s: 41 rss: 68Mb L: 5/5 MS: 1 PersAutoDict- DE: " \000"- 00:09:01.488 [2024-10-04 08:26:54.034027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.488 [2024-10-04 08:26:54.034055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.488 [2024-10-04 08:26:54.034110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.488 [2024-10-04 08:26:54.034124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:01.488 [2024-10-04 08:26:54.034176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.488 [2024-10-04 08:26:54.034196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:01.488 #42 NEW cov: 11765 ft: 14880 corp: 40/98b lim: 5 exec/s: 42 rss: 68Mb L: 3/5 MS: 1 CMP- DE: "\377\011"- 00:09:01.488 [2024-10-04 08:26:54.073846] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.488 [2024-10-04 08:26:54.073873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.488 #43 NEW cov: 11765 ft: 14978 corp: 41/99b lim: 5 exec/s: 43 rss: 68Mb L: 1/5 MS: 1 ChangeBit- 00:09:01.488 [2024-10-04 08:26:54.114150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.488 [2024-10-04 08:26:54.114176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.488 [2024-10-04 08:26:54.114220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.488 [2024-10-04 08:26:54.114234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:01.488 #44 NEW cov: 11765 ft: 15042 corp: 42/101b lim: 5 exec/s: 44 rss: 68Mb L: 2/5 MS: 1 EraseBytes- 00:09:01.488 [2024-10-04 08:26:54.154427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.488 [2024-10-04 08:26:54.154453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.488 [2024-10-04 08:26:54.154508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.488 [2024-10-04 08:26:54.154522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:01.488 [2024-10-04 08:26:54.154576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.488 [2024-10-04 08:26:54.154589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:01.746 #45 NEW cov: 11765 ft: 15046 corp: 43/104b lim: 5 exec/s: 45 rss: 69Mb L: 3/5 MS: 1 PersAutoDict- DE: " \000"- 00:09:01.746 [2024-10-04 08:26:54.194164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.746 [2024-10-04 08:26:54.194199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.746 #46 NEW cov: 11765 ft: 15054 corp: 44/105b lim: 5 exec/s: 46 rss: 69Mb L: 1/5 MS: 1 ChangeBit- 00:09:01.746 [2024-10-04 08:26:54.234431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.746 [2024-10-04 08:26:54.234457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.746 [2024-10-04 08:26:54.234508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 
cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.746 [2024-10-04 08:26:54.234522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:01.746 [2024-10-04 08:26:54.264407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.746 [2024-10-04 08:26:54.264434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.746 #48 NEW cov: 11765 ft: 15090 corp: 45/106b lim: 5 exec/s: 48 rss: 69Mb L: 1/5 MS: 2 CrossOver-EraseBytes- 00:09:01.746 [2024-10-04 08:26:54.304645] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.746 [2024-10-04 08:26:54.304671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.746 [2024-10-04 08:26:54.304725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.746 [2024-10-04 08:26:54.304739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:02.005 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:02.005 #49 NEW cov: 11788 ft: 15120 corp: 46/108b lim: 5 exec/s: 24 rss: 70Mb L: 2/5 MS: 1 InsertByte- 00:09:02.005 #49 DONE cov: 11788 ft: 15120 corp: 46/108b lim: 5 exec/s: 24 rss: 70Mb 00:09:02.005 ###### Recommended dictionary. ###### 00:09:02.005 " \000" # Uses: 3 00:09:02.005 "\377\011" # Uses: 0 00:09:02.005 ###### End of recommended dictionary. 
######
00:09:02.005 Done 49 runs in 2 second(s)
00:09:02.265 08:26:54 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_9.conf
00:09:02.265 08:26:54 -- ../common.sh@72 -- # (( i++ ))
00:09:02.265 08:26:54 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:09:02.265 08:26:54 -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1
00:09:02.265 08:26:54 -- nvmf/run.sh@23 -- # local fuzzer_type=10
00:09:02.265 08:26:54 -- nvmf/run.sh@24 -- # local timen=1
00:09:02.265 08:26:54 -- nvmf/run.sh@25 -- # local core=0x1
00:09:02.265 08:26:54 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10
00:09:02.265 08:26:54 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf
00:09:02.265 08:26:54 -- nvmf/run.sh@29 -- # printf %02d 10
00:09:02.265 08:26:54 -- nvmf/run.sh@29 -- # port=4410
00:09:02.265 08:26:54 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10
00:09:02.265 08:26:54 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410'
00:09:02.265 08:26:54 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:09:02.265 08:26:54 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 -r /var/tmp/spdk10.sock
00:09:02.524 [2024-10-04 08:26:54.757675] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization...
00:09:02.524 [2024-10-04 08:26:54.757769] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1016592 ]
00:09:02.524 EAL: No free 2048 kB hugepages reported on node 1
00:09:02.524 [2024-10-04 08:26:54.934034] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:09:02.524 [2024-10-04 08:26:54.953838] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:09:02.524 [2024-10-04 08:26:54.953957] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:09:02.524 [2024-10-04 08:26:55.005195] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:09:02.524 [2024-10-04 08:26:55.021548] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 ***
00:09:02.524 INFO: Running with entropic power schedule (0xFF, 100).
00:09:02.524 INFO: Seed: 1744740270
00:09:02.524 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63),
00:09:02.524 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8),
00:09:02.524 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10
00:09:02.524 INFO: A corpus is not provided, starting from an empty corpus
00:09:02.524 #2 INITED exec/s: 0 rss: 59Mb
00:09:02.524 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:09:02.524 This may also happen if the target rejected all inputs we tried so far 00:09:02.524 [2024-10-04 08:26:55.076705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:70ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:02.524 [2024-10-04 08:26:55.076735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:02.784 NEW_FUNC[1/670]: 0x45e248 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:09:02.784 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:02.784 #14 NEW cov: 11584 ft: 11585 corp: 2/10b lim: 40 exec/s: 0 rss: 67Mb L: 9/9 MS: 2 ChangeByte-CMP- DE: "\377\377\377\377\377\377\377l"- 00:09:02.784 [2024-10-04 08:26:55.377991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:60848484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:02.784 [2024-10-04 08:26:55.378022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:02.784 [2024-10-04 08:26:55.378081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:84848484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:02.784 [2024-10-04 08:26:55.378095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:02.784 [2024-10-04 08:26:55.378152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:84848484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:02.784 [2024-10-04 08:26:55.378167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:02.784 [2024-10-04 08:26:55.378225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:84848484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:02.784 [2024-10-04 08:26:55.378239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:02.784 [2024-10-04 08:26:55.378289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:84848484 cdw11:84843131 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:02.784 [2024-10-04 08:26:55.378306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:02.784 #19 NEW cov: 11697 ft: 12677 corp: 3/50b lim: 40 exec/s: 0 rss: 67Mb L: 40/40 MS: 5 InsertByte-InsertByte-CopyPart-CopyPart-InsertRepeatedBytes- 00:09:02.784 [2024-10-04 08:26:55.417655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:70ff70ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:02.784 [2024-10-04 08:26:55.417682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:02.784 [2024-10-04 08:26:55.417740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff6c SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:09:02.784 [2024-10-04 08:26:55.417756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:02.784 #20 NEW cov: 11703 ft: 13210 corp: 4/66b lim: 40 exec/s: 0 rss: 67Mb L: 16/40 MS: 1 CopyPart- 00:09:02.784 [2024-10-04 08:26:55.457622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:70ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:02.784 [2024-10-04 08:26:55.457650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.048 #21 NEW cov: 11788 ft: 13484 corp: 5/75b lim: 40 exec/s: 0 rss: 67Mb L: 9/40 MS: 1 ShuffleBytes- 00:09:03.048 [2024-10-04 08:26:55.497890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffff30ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.049 [2024-10-04 08:26:55.497916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.049 [2024-10-04 08:26:55.497976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.049 [2024-10-04 08:26:55.497990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:03.049 #25 NEW cov: 11788 ft: 13571 corp: 6/93b lim: 40 exec/s: 0 rss: 67Mb L: 18/40 MS: 4 EraseBytes-ChangeBit-CopyPart-InsertRepeatedBytes- 00:09:03.049 [2024-10-04 08:26:55.537974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffff30ff cdw11:ffffff01 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.049 [2024-10-04 08:26:55.538000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.049 [2024-10-04 08:26:55.538059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:000000ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.049 [2024-10-04 08:26:55.538073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:03.049 #26 NEW cov: 11788 ft: 13713 corp: 7/115b lim: 40 exec/s: 0 rss: 67Mb L: 22/40 MS: 1 CMP- DE: "\001\000\000\000"- 00:09:03.049 [2024-10-04 08:26:55.578121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffff30ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.049 [2024-10-04 08:26:55.578147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.049 [2024-10-04 08:26:55.578210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:dbffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.049 [2024-10-04 08:26:55.578225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:03.049 #27 NEW cov: 11788 ft: 13775 corp: 8/134b lim: 40 exec/s: 0 rss: 67Mb L: 19/40 MS: 1 InsertByte- 00:09:03.049 [2024-10-04 08:26:55.618048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 
nsid:0 cdw10:70ffff07 cdw11:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.049 [2024-10-04 08:26:55.618078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.049 #28 NEW cov: 11788 ft: 13942 corp: 9/143b lim: 40 exec/s: 0 rss: 67Mb L: 9/40 MS: 1 ChangeBinInt- 00:09:03.049 [2024-10-04 08:26:55.658352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:70ff70ff cdw11:fffff7ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.049 [2024-10-04 08:26:55.658378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.049 [2024-10-04 08:26:55.658437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff6c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.049 [2024-10-04 08:26:55.658452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:03.049 #29 NEW cov: 11788 ft: 14019 corp: 10/159b lim: 40 exec/s: 0 rss: 67Mb L: 16/40 MS: 1 ChangeBit- 00:09:03.049 [2024-10-04 08:26:55.698353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffff30ff cdw11:ffffff01 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.049 [2024-10-04 08:26:55.698380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.049 #30 NEW cov: 11788 ft: 14040 corp: 11/173b lim: 40 exec/s: 0 rss: 67Mb L: 14/40 MS: 1 EraseBytes- 00:09:03.308 [2024-10-04 08:26:55.739025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:60318484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.308 [2024-10-04 08:26:55.739052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.308 [2024-10-04 08:26:55.739109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:84848484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.308 [2024-10-04 08:26:55.739123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:03.308 [2024-10-04 08:26:55.739179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:84848484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.308 [2024-10-04 08:26:55.739198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:03.308 [2024-10-04 08:26:55.739259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:84848484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.308 [2024-10-04 08:26:55.739272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:03.308 [2024-10-04 08:26:55.739332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:84848484 cdw11:84843131 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.308 [2024-10-04 08:26:55.739346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 
cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:03.308 #31 NEW cov: 11788 ft: 14142 corp: 12/213b lim: 40 exec/s: 0 rss: 67Mb L: 40/40 MS: 1 ChangeByte- 00:09:03.308 [2024-10-04 08:26:55.778702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffce00 cdw11:0000ff01 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.309 [2024-10-04 08:26:55.778728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.309 [2024-10-04 08:26:55.778790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:000000ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.309 [2024-10-04 08:26:55.778803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:03.309 #32 NEW cov: 11788 ft: 14193 corp: 13/235b lim: 40 exec/s: 0 rss: 67Mb L: 22/40 MS: 1 ChangeBinInt- 00:09:03.309 [2024-10-04 08:26:55.818937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffff30ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.309 [2024-10-04 08:26:55.818963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.309 [2024-10-04 08:26:55.819018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff30ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.309 [2024-10-04 08:26:55.819032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:03.309 [2024-10-04 08:26:55.819088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffdb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.309 [2024-10-04 08:26:55.819102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:03.309 #33 NEW cov: 11788 ft: 14393 corp: 14/265b lim: 40 exec/s: 0 rss: 67Mb L: 30/40 MS: 1 CrossOver- 00:09:03.309 [2024-10-04 08:26:55.858778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffce00 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.309 [2024-10-04 08:26:55.858804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.309 #34 NEW cov: 11788 ft: 14501 corp: 15/280b lim: 40 exec/s: 0 rss: 67Mb L: 15/40 MS: 1 EraseBytes- 00:09:03.309 [2024-10-04 08:26:55.899443] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:60848484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.309 [2024-10-04 08:26:55.899469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.309 [2024-10-04 08:26:55.899528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:84848484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.309 [2024-10-04 08:26:55.899541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:03.309 [2024-10-04 08:26:55.899598] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:84848484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.309 [2024-10-04 08:26:55.899612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:03.309 [2024-10-04 08:26:55.899667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:84848484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.309 [2024-10-04 08:26:55.899681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:03.309 [2024-10-04 08:26:55.899732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:84848484 cdw11:84843131 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.309 [2024-10-04 08:26:55.899745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:03.309 #35 NEW cov: 11788 ft: 14542 corp: 16/320b lim: 40 exec/s: 0 rss: 68Mb L: 40/40 MS: 1 ShuffleBytes- 00:09:03.309 [2024-10-04 08:26:55.939518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:60608484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.309 [2024-10-04 08:26:55.939544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.309 [2024-10-04 08:26:55.939606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:84848484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.309 [2024-10-04 08:26:55.939624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:03.309 [2024-10-04 08:26:55.939683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:84848484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.309 [2024-10-04 08:26:55.939698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:03.309 [2024-10-04 08:26:55.939751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:84848484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.309 [2024-10-04 08:26:55.939764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:03.309 [2024-10-04 08:26:55.939823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:84848484 cdw11:84843131 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.309 [2024-10-04 08:26:55.939836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:03.309 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:03.309 #36 NEW cov: 11811 ft: 14599 corp: 17/360b lim: 40 exec/s: 0 rss: 68Mb L: 40/40 MS: 1 ChangeByte- 00:09:03.568 [2024-10-04 08:26:55.989378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0affffce cdw11:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.569 [2024-10-04 
08:26:55.989405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.569 [2024-10-04 08:26:55.989470] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:01000000 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.569 [2024-10-04 08:26:55.989485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:03.569 #37 NEW cov: 11811 ft: 14625 corp: 18/383b lim: 40 exec/s: 0 rss: 68Mb L: 23/40 MS: 1 CrossOver- 00:09:03.569 [2024-10-04 08:26:56.029276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffff30ff cdw11:ffff1cff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.569 [2024-10-04 08:26:56.029301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.569 #38 NEW cov: 11811 ft: 14685 corp: 19/398b lim: 40 exec/s: 38 rss: 68Mb L: 15/40 MS: 1 InsertByte- 00:09:03.569 [2024-10-04 08:26:56.069404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:70ff70ff cdw11:fffff7ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.569 [2024-10-04 08:26:56.069431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.569 #39 NEW cov: 11811 ft: 14690 corp: 20/408b lim: 40 exec/s: 39 rss: 68Mb L: 10/40 MS: 1 EraseBytes- 00:09:03.569 [2024-10-04 08:26:56.109521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:70ffffff cdw11:010000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.569 [2024-10-04 08:26:56.109547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.569 #40 NEW cov: 11811 ft: 14745 corp: 21/417b lim: 40 exec/s: 40 rss: 68Mb L: 9/40 MS: 1 CrossOver- 00:09:03.569 [2024-10-04 08:26:56.150071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:70ff6464 cdw11:64646464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.569 [2024-10-04 08:26:56.150097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.569 [2024-10-04 08:26:56.150156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:64646464 cdw11:64646464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.569 [2024-10-04 08:26:56.150173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:03.569 [2024-10-04 08:26:56.150237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:64646464 cdw11:64646464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.569 [2024-10-04 08:26:56.150251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:03.569 [2024-10-04 08:26:56.150286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:64646464 cdw11:64646464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.569 [2024-10-04 08:26:56.150300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:03.569 #41 NEW cov: 11811 ft: 14765 corp: 22/456b lim: 40 exec/s: 41 rss: 68Mb L: 39/40 MS: 1 InsertRepeatedBytes- 00:09:03.569 [2024-10-04 08:26:56.189922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01000200 cdw11:ffff30ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.569 [2024-10-04 08:26:56.189948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.569 [2024-10-04 08:26:56.190007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.569 [2024-10-04 08:26:56.190021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:03.569 #42 NEW cov: 11811 ft: 14787 corp: 23/479b lim: 40 exec/s: 42 rss: 68Mb L: 23/40 MS: 1 CMP- DE: "\001\000\002\000"- 00:09:03.569 [2024-10-04 08:26:56.230412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:60848484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.569 [2024-10-04 08:26:56.230437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.569 [2024-10-04 08:26:56.230496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:84848484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.569 [2024-10-04 08:26:56.230510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:03.569 [2024-10-04 08:26:56.230547] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:84848484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.569 [2024-10-04 08:26:56.230560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:03.569 [2024-10-04 08:26:56.230621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:84848484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.569 [2024-10-04 08:26:56.230634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:03.569 [2024-10-04 08:26:56.230693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:84848484 cdw11:84843035 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.569 [2024-10-04 08:26:56.230707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:03.829 #43 NEW cov: 11811 ft: 14822 corp: 24/519b lim: 40 exec/s: 43 rss: 68Mb L: 40/40 MS: 1 ChangeASCIIInt- 00:09:03.829 [2024-10-04 08:26:56.270258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ff70ffce cdw11:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.829 [2024-10-04 08:26:56.270283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.829 [2024-10-04 08:26:56.270351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 
cdw10:01ffffff cdw11:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.829 [2024-10-04 08:26:56.270365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:03.829 [2024-10-04 08:26:56.270427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffff01 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.829 [2024-10-04 08:26:56.270441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:03.829 #44 NEW cov: 11811 ft: 14841 corp: 25/549b lim: 40 exec/s: 44 rss: 68Mb L: 30/40 MS: 1 CrossOver- 00:09:03.829 [2024-10-04 08:26:56.310278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffff30ff cdw11:ff6cffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.829 [2024-10-04 08:26:56.310303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.829 [2024-10-04 08:26:56.310362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ff1cff01 cdw11:00ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.829 [2024-10-04 08:26:56.310376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:03.829 #45 NEW cov: 11811 ft: 14856 corp: 26/567b lim: 40 exec/s: 45 rss: 68Mb L: 18/40 MS: 1 CopyPart- 00:09:03.829 [2024-10-04 08:26:56.350205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffff300a cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.829 [2024-10-04 08:26:56.350231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.829 #46 NEW cov: 11811 ft: 14879 corp: 27/582b lim: 40 exec/s: 46 rss: 68Mb L: 15/40 MS: 1 InsertByte- 00:09:03.829 [2024-10-04 08:26:56.390451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffff300a cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.829 [2024-10-04 08:26:56.390476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.829 [2024-10-04 08:26:56.390535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:3d0100ff cdw11:ffffff6c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.829 [2024-10-04 08:26:56.390549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:03.829 #47 NEW cov: 11811 ft: 14933 corp: 28/598b lim: 40 exec/s: 47 rss: 68Mb L: 16/40 MS: 1 InsertByte- 00:09:03.829 [2024-10-04 08:26:56.430474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffff30ff cdw11:ff6cffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.829 [2024-10-04 08:26:56.430500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.829 #48 NEW cov: 11811 ft: 14936 corp: 29/612b lim: 40 exec/s: 48 rss: 69Mb L: 14/40 MS: 1 EraseBytes- 00:09:03.829 [2024-10-04 08:26:56.471017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 
cid:4 nsid:0 cdw10:01000200 cdw11:ffff30ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.829 [2024-10-04 08:26:56.471044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.829 [2024-10-04 08:26:56.471104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.829 [2024-10-04 08:26:56.471118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:03.829 [2024-10-04 08:26:56.471173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.829 [2024-10-04 08:26:56.471195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:03.829 [2024-10-04 08:26:56.471255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffdb cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:03.829 [2024-10-04 08:26:56.471269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:03.829 #49 NEW cov: 11811 ft: 14944 corp: 30/646b lim: 40 exec/s: 49 rss: 69Mb L: 34/40 MS: 1 InsertRepeatedBytes- 00:09:04.090 [2024-10-04 08:26:56.510839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:70ff70ff cdw11:fffff7ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.090 [2024-10-04 08:26:56.510865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.090 [2024-10-04 08:26:56.510926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff3bff6c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.090 [2024-10-04 08:26:56.510941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.090 #50 NEW cov: 11811 ft: 14951 corp: 31/662b lim: 40 exec/s: 50 rss: 69Mb L: 16/40 MS: 1 ChangeByte- 00:09:04.090 [2024-10-04 08:26:56.551118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffff30ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.090 [2024-10-04 08:26:56.551144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.090 [2024-10-04 08:26:56.551213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff30ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.090 [2024-10-04 08:26:56.551228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.090 [2024-10-04 08:26:56.551283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffdb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.090 [2024-10-04 08:26:56.551297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:04.090 #51 NEW cov: 11811 ft: 14971 corp: 32/692b lim: 40 
exec/s: 51 rss: 69Mb L: 30/40 MS: 1 CopyPart- 00:09:04.090 [2024-10-04 08:26:56.590943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffc0ff01 cdw11:00000030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.090 [2024-10-04 08:26:56.590968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.090 #56 NEW cov: 11811 ft: 14997 corp: 33/703b lim: 40 exec/s: 56 rss: 69Mb L: 11/40 MS: 5 EraseBytes-ShuffleBytes-ChangeASCIIInt-ChangeByte-PersAutoDict- DE: "\001\000\000\000"- 00:09:04.090 [2024-10-04 08:26:56.631198] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffff30ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.090 [2024-10-04 08:26:56.631223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.090 [2024-10-04 08:26:56.631285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffdbffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.090 [2024-10-04 08:26:56.631299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.090 #57 NEW cov: 11811 ft: 15006 corp: 34/723b lim: 40 exec/s: 57 rss: 69Mb L: 20/40 MS: 1 CrossOver- 00:09:04.090 [2024-10-04 08:26:56.671555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ff00ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.090 [2024-10-04 08:26:56.671583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.090 [2024-10-04 08:26:56.671649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffff30ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.090 [2024-10-04 08:26:56.671663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.090 [2024-10-04 08:26:56.671720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff30ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.090 [2024-10-04 08:26:56.671734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:04.090 [2024-10-04 08:26:56.671796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffdb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.090 [2024-10-04 08:26:56.671811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:04.090 #58 NEW cov: 11811 ft: 15021 corp: 35/761b lim: 40 exec/s: 58 rss: 69Mb L: 38/40 MS: 1 CrossOver- 00:09:04.090 [2024-10-04 08:26:56.711419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffce00 cdw11:0000ff01 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.090 [2024-10-04 08:26:56.711444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.090 [2024-10-04 08:26:56.711505] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:000000ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.090 [2024-10-04 08:26:56.711519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.090 #59 NEW cov: 11811 ft: 15058 corp: 36/783b lim: 40 exec/s: 59 rss: 69Mb L: 22/40 MS: 1 ChangeBinInt- 00:09:04.090 [2024-10-04 08:26:56.751580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffff3106 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.090 [2024-10-04 08:26:56.751606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.090 [2024-10-04 08:26:56.751664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:dbffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.090 [2024-10-04 08:26:56.751678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.351 #60 NEW cov: 11811 ft: 15072 corp: 37/802b lim: 40 exec/s: 60 rss: 69Mb L: 19/40 MS: 1 ChangeBinInt- 00:09:04.351 [2024-10-04 08:26:56.791800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:010000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.351 [2024-10-04 08:26:56.791825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.351 [2024-10-04 08:26:56.791883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:01ffffff cdw11:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.351 [2024-10-04 08:26:56.791897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.351 [2024-10-04 08:26:56.791957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffff01 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.351 [2024-10-04 08:26:56.791971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:04.351 #61 NEW cov: 11811 ft: 15080 corp: 38/832b lim: 40 exec/s: 61 rss: 69Mb L: 30/40 MS: 1 CMP- DE: "\377\377\377\001"- 00:09:04.351 [2024-10-04 08:26:56.832181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:60848484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.351 [2024-10-04 08:26:56.832211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.351 [2024-10-04 08:26:56.832271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:84848484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.351 [2024-10-04 08:26:56.832286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.351 [2024-10-04 08:26:56.832345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:84848484 cdw11:84010002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.351 [2024-10-04 
08:26:56.832360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:04.351 [2024-10-04 08:26:56.832420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00848484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.351 [2024-10-04 08:26:56.832434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:04.351 [2024-10-04 08:26:56.832494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:84848484 cdw11:84843131 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.351 [2024-10-04 08:26:56.832509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:04.351 #62 NEW cov: 11811 ft: 15084 corp: 39/872b lim: 40 exec/s: 62 rss: 69Mb L: 40/40 MS: 1 PersAutoDict- DE: "\001\000\002\000"- 00:09:04.351 [2024-10-04 08:26:56.872171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffff30ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.351 [2024-10-04 08:26:56.872200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.351 [2024-10-04 08:26:56.872262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffdb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.351 [2024-10-04 08:26:56.872276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.351 [2024-10-04 08:26:56.872337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ff30ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.351 [2024-10-04 08:26:56.872363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:04.351 [2024-10-04 08:26:56.872437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffdb cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.351 [2024-10-04 08:26:56.872452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:04.351 #63 NEW cov: 11811 ft: 15103 corp: 40/906b lim: 40 exec/s: 63 rss: 69Mb L: 34/40 MS: 1 CopyPart- 00:09:04.351 [2024-10-04 08:26:56.911879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffff30ff cdw11:ffffff01 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.351 [2024-10-04 08:26:56.911904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.351 #64 NEW cov: 11811 ft: 15104 corp: 41/920b lim: 40 exec/s: 64 rss: 69Mb L: 14/40 MS: 1 ChangeBinInt- 00:09:04.351 [2024-10-04 08:26:56.942087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffff300a cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.351 [2024-10-04 08:26:56.942116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.351 [2024-10-04 08:26:56.942174] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:3d01ffff cdw11:00ffff6c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.351 [2024-10-04 08:26:56.942194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.351 #65 NEW cov: 11811 ft: 15112 corp: 42/936b lim: 40 exec/s: 65 rss: 69Mb L: 16/40 MS: 1 ShuffleBytes- 00:09:04.351 [2024-10-04 08:26:56.982113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:70ff67e0 cdw11:7c29ba33 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.351 [2024-10-04 08:26:56.982138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.351 #66 NEW cov: 11811 ft: 15125 corp: 43/945b lim: 40 exec/s: 66 rss: 69Mb L: 9/40 MS: 1 CMP- DE: "\377g\340|)\2723\360"- 00:09:04.351 [2024-10-04 08:26:57.022700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:60318484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.351 [2024-10-04 08:26:57.022726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.351 [2024-10-04 08:26:57.022786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:84848484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.351 [2024-10-04 08:26:57.022800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.351 [2024-10-04 08:26:57.022861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:84848484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.351 [2024-10-04 08:26:57.022874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:04.351 [2024-10-04 08:26:57.022912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:84848484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.351 [2024-10-04 08:26:57.022925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:04.351 [2024-10-04 08:26:57.022985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:84848484 cdw11:84c43131 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.351 [2024-10-04 08:26:57.022999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:04.612 #67 NEW cov: 11811 ft: 15165 corp: 44/985b lim: 40 exec/s: 67 rss: 69Mb L: 40/40 MS: 1 ChangeBit- 00:09:04.612 [2024-10-04 08:26:57.073088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:60848484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.612 [2024-10-04 08:26:57.073114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.612 [2024-10-04 08:26:57.073175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:84848484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.612 
[2024-10-04 08:26:57.073193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.612 [2024-10-04 08:26:57.073253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:84848484 cdw11:84010202 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.612 [2024-10-04 08:26:57.073267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:04.612 [2024-10-04 08:26:57.073329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00848484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.612 [2024-10-04 08:26:57.073346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:04.612 [2024-10-04 08:26:57.073404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:84848484 cdw11:84843131 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:04.612 [2024-10-04 08:26:57.073419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:04.612 #68 NEW cov: 11811 ft: 15263 corp: 45/1025b lim: 40 exec/s: 34 rss: 69Mb L: 40/40 MS: 1 ChangeBit- 00:09:04.612 #68 DONE cov: 11811 ft: 15263 corp: 45/1025b lim: 40 exec/s: 34 rss: 69Mb 00:09:04.612 ###### Recommended dictionary. ###### 00:09:04.612 "\377\377\377\377\377\377\377l" # Uses: 0 00:09:04.612 "\001\000\000\000" # Uses: 1 00:09:04.612 "\001\000\002\000" # Uses: 1 00:09:04.612 "\377\377\377\001" # Uses: 0 00:09:04.612 "\377g\340|)\2723\360" # Uses: 0 00:09:04.612 ###### End of recommended dictionary. 
###### 00:09:04.612 Done 68 runs in 2 second(s) 00:09:04.612 08:26:57 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_10.conf 00:09:04.612 08:26:57 -- ../common.sh@72 -- # (( i++ )) 00:09:04.612 08:26:57 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:04.612 08:26:57 -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1 00:09:04.612 08:26:57 -- nvmf/run.sh@23 -- # local fuzzer_type=11 00:09:04.612 08:26:57 -- nvmf/run.sh@24 -- # local timen=1 00:09:04.612 08:26:57 -- nvmf/run.sh@25 -- # local core=0x1 00:09:04.612 08:26:57 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:09:04.612 08:26:57 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf 00:09:04.612 08:26:57 -- nvmf/run.sh@29 -- # printf %02d 11 00:09:04.612 08:26:57 -- nvmf/run.sh@29 -- # port=4411 00:09:04.612 08:26:57 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:09:04.612 08:26:57 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' 00:09:04.612 08:26:57 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:04.612 08:26:57 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 -r /var/tmp/spdk11.sock 00:09:04.612 [2024-10-04 08:26:57.256070] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:09:04.612 [2024-10-04 08:26:57.256140] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1016916 ] 00:09:04.612 EAL: No free 2048 kB hugepages reported on node 1 00:09:04.871 [2024-10-04 08:26:57.438454] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:04.872 [2024-10-04 08:26:57.457616] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:04.872 [2024-10-04 08:26:57.457743] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:04.872 [2024-10-04 08:26:57.509298] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:04.872 [2024-10-04 08:26:57.525668] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 *** 00:09:04.872 INFO: Running with entropic power schedule (0xFF, 100). 00:09:04.872 INFO: Seed: 4247759071 00:09:05.131 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:09:05.131 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:09:05.131 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:09:05.131 INFO: A corpus is not provided, starting from an empty corpus 00:09:05.131 #2 INITED exec/s: 0 rss: 60Mb 00:09:05.131 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
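Note: the xtrace block above shows how run.sh rotates fuzzers. When run 10 finishes, its JSON config is removed, the loop counter advances, and start_llvm_fuzz 11 1 0x1 gives the next target its own TCP port, config file, corpus directory, and RPC socket, so consecutive instances never collide. A minimal shell sketch of that derivation, reconstructed from the traced commands (the redirect of the sed output into the new config file is an assumption; the trace does not show where the rewritten JSON is written):

  i=11                                           # fuzzer_type passed to start_llvm_fuzz
  port="44$(printf %02d "$i")"                   # zero-padded -> 4411, the trsvcid seen above
  nvmf_cfg="/tmp/fuzz_json_${i}.conf"
  corpus_dir="$rootdir/../corpus/llvm_nvmf_${i}" # $rootdir: assumed name for the spdk checkout
  mkdir -p "$corpus_dir"
  trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:${port}"
  # the template config listens on 4420; rewrite it for this run's port
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"${port}\"/" \
      "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"

The llvm_nvme_fuzz binary is then launched with -Z 11, -c "$nvmf_cfg", -D "$corpus_dir" and -r /var/tmp/spdk11.sock, exactly as traced above.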
00:09:05.131 This may also happen if the target rejected all inputs we tried so far 00:09:05.131 [2024-10-04 08:26:57.591515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:76700000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.131 [2024-10-04 08:26:57.591552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:05.391 NEW_FUNC[1/668]: 0x45ffb8 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:09:05.391 NEW_FUNC[2/668]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:05.391 #17 NEW cov: 11570 ft: 11573 corp: 2/11b lim: 40 exec/s: 0 rss: 67Mb L: 10/10 MS: 5 ChangeBit-ChangeByte-InsertByte-ChangeByte-CMP- DE: "p\000\000\000\000\000\000\000"- 00:09:05.391 [2024-10-04 08:26:57.912619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:006f6f6f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.391 [2024-10-04 08:26:57.912660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:05.391 [2024-10-04 08:26:57.912787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:6f6f6f6f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.391 [2024-10-04 08:26:57.912802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:05.391 NEW_FUNC[1/3]: 0x1537a68 in nvme_ctrlr_process_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_ctrlr.c:3790 00:09:05.391 NEW_FUNC[2/3]: 0x1707538 in spdk_nvme_probe_poll_async /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme.c:1507 00:09:05.391 #20 NEW cov: 11709 ft: 12880 corp: 3/33b lim: 40 exec/s: 0 rss: 67Mb L: 22/22 MS: 3 CrossOver-CopyPart-InsertRepeatedBytes- 00:09:05.391 [2024-10-04 08:26:57.962725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:006f6f6f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.391 [2024-10-04 08:26:57.962758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:05.391 [2024-10-04 08:26:57.962880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:6f6f6f6f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.391 [2024-10-04 08:26:57.962896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:05.391 #26 NEW cov: 11715 ft: 13044 corp: 4/55b lim: 40 exec/s: 0 rss: 67Mb L: 22/22 MS: 1 CrossOver- 00:09:05.391 [2024-10-04 08:26:58.013124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0e2f006f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.391 [2024-10-04 08:26:58.013156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:05.391 [2024-10-04 08:26:58.013306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:6f6f6f6f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:09:05.391 [2024-10-04 08:26:58.013323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:05.391 [2024-10-04 08:26:58.013455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:6f6f6f6f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.391 [2024-10-04 08:26:58.013470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:05.392 #29 NEW cov: 11800 ft: 13515 corp: 5/79b lim: 40 exec/s: 0 rss: 67Mb L: 24/24 MS: 3 InsertByte-ChangeBit-CrossOver- 00:09:05.392 [2024-10-04 08:26:58.053210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:006f6f6f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.392 [2024-10-04 08:26:58.053242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:05.392 [2024-10-04 08:26:58.053366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:6f6f6f6f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.392 [2024-10-04 08:26:58.053383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:05.392 [2024-10-04 08:26:58.053496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:6f6f6f6f cdw11:70000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.392 [2024-10-04 08:26:58.053513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:05.651 #30 NEW cov: 11800 ft: 13708 corp: 6/109b lim: 40 exec/s: 0 rss: 68Mb L: 30/30 MS: 1 PersAutoDict- DE: "p\000\000\000\000\000\000\000"- 00:09:05.651 [2024-10-04 08:26:58.103085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:006f6f6f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.651 [2024-10-04 08:26:58.103113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:05.651 [2024-10-04 08:26:58.103257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:6f6f6f6f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.651 [2024-10-04 08:26:58.103274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:05.651 #31 NEW cov: 11800 ft: 13771 corp: 7/127b lim: 40 exec/s: 0 rss: 68Mb L: 18/30 MS: 1 EraseBytes- 00:09:05.651 [2024-10-04 08:26:58.142891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:76700000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.651 [2024-10-04 08:26:58.142919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:05.651 #32 NEW cov: 11800 ft: 13847 corp: 8/142b lim: 40 exec/s: 0 rss: 68Mb L: 15/30 MS: 1 CopyPart- 00:09:05.651 [2024-10-04 08:26:58.182964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:76700000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.651 [2024-10-04 08:26:58.182992] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:05.651 #33 NEW cov: 11800 ft: 13984 corp: 9/157b lim: 40 exec/s: 0 rss: 68Mb L: 15/30 MS: 1 ChangeBinInt- 00:09:05.651 [2024-10-04 08:26:58.223679] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:006f6f6f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.651 [2024-10-04 08:26:58.223707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:05.651 [2024-10-04 08:26:58.223842] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:6f6f6f6f cdw11:6f6f6e6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.651 [2024-10-04 08:26:58.223858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:05.651 [2024-10-04 08:26:58.223991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:6f6f6f6f cdw11:70000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.651 [2024-10-04 08:26:58.224007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:05.651 #34 NEW cov: 11800 ft: 14004 corp: 10/187b lim: 40 exec/s: 0 rss: 68Mb L: 30/30 MS: 1 ChangeBit- 00:09:05.651 [2024-10-04 08:26:58.264098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:006f6f6f cdw11:6f181818 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.651 [2024-10-04 08:26:58.264126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:05.651 [2024-10-04 08:26:58.264237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:1818186f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.651 [2024-10-04 08:26:58.264253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:05.651 [2024-10-04 08:26:58.264391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:6f6f6f6f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.651 [2024-10-04 08:26:58.264406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:05.651 [2024-10-04 08:26:58.264534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:6f6f7000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.651 [2024-10-04 08:26:58.264552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:05.651 #35 NEW cov: 11800 ft: 14370 corp: 11/223b lim: 40 exec/s: 0 rss: 68Mb L: 36/36 MS: 1 InsertRepeatedBytes- 00:09:05.651 [2024-10-04 08:26:58.303397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:70760000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.651 [2024-10-04 08:26:58.303425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:05.651 #36 NEW cov: 11800 ft: 14413 corp: 12/233b lim: 40 exec/s: 0 rss: 68Mb L: 10/36 MS: 1 ShuffleBytes- 00:09:05.910 
[2024-10-04 08:26:58.344079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0e2f006f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.910 [2024-10-04 08:26:58.344108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:05.910 [2024-10-04 08:26:58.344253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:6f6f6fba cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.910 [2024-10-04 08:26:58.344271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:05.910 [2024-10-04 08:26:58.344406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:6f6f6f6f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.910 [2024-10-04 08:26:58.344423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:05.910 #37 NEW cov: 11800 ft: 14443 corp: 13/258b lim: 40 exec/s: 0 rss: 68Mb L: 25/36 MS: 1 InsertByte- 00:09:05.910 [2024-10-04 08:26:58.383601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:70760080 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.910 [2024-10-04 08:26:58.383630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:05.910 #38 NEW cov: 11800 ft: 14480 corp: 14/268b lim: 40 exec/s: 0 rss: 68Mb L: 10/36 MS: 1 ChangeBit- 00:09:05.910 [2024-10-04 08:26:58.423805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:08700000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.910 [2024-10-04 08:26:58.423832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:05.910 #43 NEW cov: 11800 ft: 14510 corp: 15/279b lim: 40 exec/s: 0 rss: 68Mb L: 11/36 MS: 5 ChangeBit-InsertByte-InsertByte-ShuffleBytes-PersAutoDict- DE: "p\000\000\000\000\000\000\000"- 00:09:05.910 [2024-10-04 08:26:58.464720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:006f6f6f cdw11:6f181818 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.910 [2024-10-04 08:26:58.464747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:05.910 [2024-10-04 08:26:58.464892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:1818186f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.910 [2024-10-04 08:26:58.464914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:05.910 [2024-10-04 08:26:58.465045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:6f6f6f6f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.911 [2024-10-04 08:26:58.465061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:05.911 [2024-10-04 08:26:58.465195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:6f6f7000 cdw11:00000000 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:09:05.911 [2024-10-04 08:26:58.465212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:05.911 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:05.911 #44 NEW cov: 11823 ft: 14564 corp: 16/318b lim: 40 exec/s: 0 rss: 68Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:09:05.911 [2024-10-04 08:26:58.514919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:006f6f6f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.911 [2024-10-04 08:26:58.514949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:05.911 [2024-10-04 08:26:58.515081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:6f6f6f6f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.911 [2024-10-04 08:26:58.515097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:05.911 [2024-10-04 08:26:58.515235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:6f6f6f6f cdw11:70000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.911 [2024-10-04 08:26:58.515253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:05.911 [2024-10-04 08:26:58.515386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:01000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.911 [2024-10-04 08:26:58.515401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:05.911 #45 NEW cov: 11823 ft: 14600 corp: 17/356b lim: 40 exec/s: 0 rss: 68Mb L: 38/39 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\004"- 00:09:05.911 [2024-10-04 08:26:58.554680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0000006f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.911 [2024-10-04 08:26:58.554708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:05.911 [2024-10-04 08:26:58.554840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:6f6f6fba cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.911 [2024-10-04 08:26:58.554857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:05.911 [2024-10-04 08:26:58.554987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:6f6f6f6f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.911 [2024-10-04 08:26:58.555003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:05.911 #46 NEW cov: 11823 ft: 14622 corp: 18/381b lim: 40 exec/s: 46 rss: 68Mb L: 25/39 MS: 1 CMP- DE: "\000\000"- 00:09:06.170 [2024-10-04 08:26:58.594876] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0000006f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.170 [2024-10-04 08:26:58.594903] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.170 [2024-10-04 08:26:58.595039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:6f6f6fba cdw11:6f6f0100 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.170 [2024-10-04 08:26:58.595054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:06.170 [2024-10-04 08:26:58.595178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00046f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.170 [2024-10-04 08:26:58.595196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:06.170 #47 NEW cov: 11823 ft: 14643 corp: 19/406b lim: 40 exec/s: 47 rss: 68Mb L: 25/39 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\004"- 00:09:06.170 [2024-10-04 08:26:58.634711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:006f6f6f cdw11:6f767000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.170 [2024-10-04 08:26:58.634737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.170 [2024-10-04 08:26:58.634870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00767000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.170 [2024-10-04 08:26:58.634885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:06.170 #48 NEW cov: 11823 ft: 14657 corp: 20/426b lim: 40 exec/s: 48 rss: 68Mb L: 20/39 MS: 1 CrossOver- 00:09:06.170 [2024-10-04 08:26:58.674618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:19700000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.170 [2024-10-04 08:26:58.674647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.170 #49 NEW cov: 11823 ft: 14698 corp: 21/441b lim: 40 exec/s: 49 rss: 69Mb L: 15/39 MS: 1 ChangeByte- 00:09:06.170 [2024-10-04 08:26:58.715257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:006f6f6f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.170 [2024-10-04 08:26:58.715294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.170 [2024-10-04 08:26:58.715417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:6f6f6f6f cdw11:6f6f6e6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.170 [2024-10-04 08:26:58.715433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:06.170 [2024-10-04 08:26:58.715556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:6f6f6f6f cdw11:f5ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.170 [2024-10-04 08:26:58.715573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:06.170 #50 NEW cov: 11823 ft: 14738 corp: 22/471b lim: 40 exec/s: 50 rss: 69Mb L: 
30/39 MS: 1 CMP- DE: "\365\377\377\377"- 00:09:06.170 [2024-10-04 08:26:58.755560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:76700000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.170 [2024-10-04 08:26:58.755588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.170 [2024-10-04 08:26:58.755728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00484848 cdw11:48484848 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.170 [2024-10-04 08:26:58.755745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:06.170 [2024-10-04 08:26:58.755883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:48484848 cdw11:48484848 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.170 [2024-10-04 08:26:58.755905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:06.170 [2024-10-04 08:26:58.756047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:48484848 cdw11:48484848 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.170 [2024-10-04 08:26:58.756063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:06.170 #51 NEW cov: 11823 ft: 14754 corp: 23/508b lim: 40 exec/s: 51 rss: 69Mb L: 37/39 MS: 1 InsertRepeatedBytes- 00:09:06.170 [2024-10-04 08:26:58.795708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:76700000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.170 [2024-10-04 08:26:58.795737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.170 [2024-10-04 08:26:58.795863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00484848 cdw11:48484848 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.170 [2024-10-04 08:26:58.795879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:06.170 [2024-10-04 08:26:58.796015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:48484848 cdw11:48484c48 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.170 [2024-10-04 08:26:58.796032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:06.170 [2024-10-04 08:26:58.796171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:48484848 cdw11:48484848 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.170 [2024-10-04 08:26:58.796192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:06.170 #52 NEW cov: 11823 ft: 14757 corp: 24/545b lim: 40 exec/s: 52 rss: 69Mb L: 37/39 MS: 1 ChangeBit- 00:09:06.170 [2024-10-04 08:26:58.846224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:006f6f6f cdw11:3b3b3b3b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.170 [2024-10-04 08:26:58.846253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.170 [2024-10-04 08:26:58.846388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:3b3b3b3b cdw11:3b3b3b3b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.170 [2024-10-04 08:26:58.846405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:06.170 [2024-10-04 08:26:58.846533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:3b3b3b3b cdw11:3b3b3b3b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.170 [2024-10-04 08:26:58.846553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:06.170 [2024-10-04 08:26:58.846676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:3b3b6f6f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.170 [2024-10-04 08:26:58.846694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:06.170 [2024-10-04 08:26:58.846824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:6f6f6f6f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.170 [2024-10-04 08:26:58.846842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:06.429 #53 NEW cov: 11823 ft: 14851 corp: 25/585b lim: 40 exec/s: 53 rss: 69Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:09:06.429 [2024-10-04 08:26:58.896115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:006f6f6f cdw11:6f181818 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.429 [2024-10-04 08:26:58.896146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.429 [2024-10-04 08:26:58.896256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:1818186f cdw11:6f6f9190 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.429 [2024-10-04 08:26:58.896273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:06.429 [2024-10-04 08:26:58.896410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:90909090 cdw11:90936f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.429 [2024-10-04 08:26:58.896426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:06.429 [2024-10-04 08:26:58.896550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:6f6f7000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.429 [2024-10-04 08:26:58.896567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:06.429 #54 NEW cov: 11823 ft: 14868 corp: 26/621b lim: 40 exec/s: 54 rss: 69Mb L: 36/40 MS: 1 ChangeBinInt- 00:09:06.430 [2024-10-04 08:26:58.935404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:76700000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.430 [2024-10-04 08:26:58.935432] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.430 #55 NEW cov: 11823 ft: 14909 corp: 27/632b lim: 40 exec/s: 55 rss: 69Mb L: 11/40 MS: 1 InsertByte- 00:09:06.430 [2024-10-04 08:26:58.975786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:76000070 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.430 [2024-10-04 08:26:58.975815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.430 [2024-10-04 08:26:58.975944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00007670 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.430 [2024-10-04 08:26:58.975961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:06.430 #56 NEW cov: 11823 ft: 14923 corp: 28/649b lim: 40 exec/s: 56 rss: 69Mb L: 17/40 MS: 1 PersAutoDict- DE: "\000\000"- 00:09:06.430 [2024-10-04 08:26:59.015651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:76700000 cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.430 [2024-10-04 08:26:59.015680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.430 #57 NEW cov: 11823 ft: 14940 corp: 29/659b lim: 40 exec/s: 57 rss: 69Mb L: 10/40 MS: 1 ChangeBinInt- 00:09:06.430 [2024-10-04 08:26:59.056708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:006f6f6f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.430 [2024-10-04 08:26:59.056736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.430 [2024-10-04 08:26:59.056860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:6f6f6f6f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.430 [2024-10-04 08:26:59.056878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:06.430 [2024-10-04 08:26:59.056995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:6f6f6f6f cdw11:6f6e6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.430 [2024-10-04 08:26:59.057012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:06.430 [2024-10-04 08:26:59.057148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:6f6f6ff5 cdw11:ffffff00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.430 [2024-10-04 08:26:59.057163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:06.430 #58 NEW cov: 11823 ft: 15011 corp: 30/696b lim: 40 exec/s: 58 rss: 69Mb L: 37/40 MS: 1 CrossOver- 00:09:06.430 [2024-10-04 08:26:59.105913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:76700000 cdw11:00000006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.430 [2024-10-04 08:26:59.105941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
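Note: for readers skimming these records, the libFuzzer status lines (for example "#58 NEW cov: 11823 ft: 15011 corp: 30/696b lim: 40 exec/s: 58 rss: 69Mb" just above) read as follows: #N is the number of executions so far, NEW means the input grew coverage and was kept, cov counts covered code edges, ft counts coverage features (a finer-grained signal that includes hit counts), corp is corpus units over total bytes, lim is the current input-length cap while libFuzzer ramps toward its maximum length, L gives the new unit's size over the largest unit in the corpus, MS lists the stacked mutations that produced it, and DE names the dictionary entry used when a dictionary-based mutation fired. To pull the coverage trend out of an archived log like this one, a throwaway one-liner is enough (console.log is a placeholder name):

  grep -o 'cov: [0-9]*' console.log | awk '{print $2}'   # one cov value per status line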
00:09:06.689 #59 NEW cov: 11823 ft: 15024 corp: 31/706b lim: 40 exec/s: 59 rss: 69Mb L: 10/40 MS: 1 ChangeBinInt- 00:09:06.689 [2024-10-04 08:26:59.146788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a767000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.689 [2024-10-04 08:26:59.146817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.689 [2024-10-04 08:26:59.146950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00004848 cdw11:48484848 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.689 [2024-10-04 08:26:59.146967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:06.689 [2024-10-04 08:26:59.147082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:48484848 cdw11:4848484c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.689 [2024-10-04 08:26:59.147100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:06.689 [2024-10-04 08:26:59.147225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:48484848 cdw11:48484848 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.689 [2024-10-04 08:26:59.147242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:06.689 #60 NEW cov: 11823 ft: 15049 corp: 32/744b lim: 40 exec/s: 60 rss: 69Mb L: 38/40 MS: 1 InsertByte- 00:09:06.689 [2024-10-04 08:26:59.196704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0e2f006f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.689 [2024-10-04 08:26:59.196734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.689 [2024-10-04 08:26:59.196862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:6f6f6f6f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.689 [2024-10-04 08:26:59.196879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:06.689 [2024-10-04 08:26:59.197008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:6f6f6f6f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.689 [2024-10-04 08:26:59.197025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:06.689 #61 NEW cov: 11823 ft: 15095 corp: 33/768b lim: 40 exec/s: 61 rss: 69Mb L: 24/40 MS: 1 ShuffleBytes- 00:09:06.689 [2024-10-04 08:26:59.236222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:76197000 cdw11:00000006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.689 [2024-10-04 08:26:59.236251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.689 #62 NEW cov: 11823 ft: 15106 corp: 34/778b lim: 40 exec/s: 62 rss: 69Mb L: 10/40 MS: 1 CrossOver- 00:09:06.689 [2024-10-04 08:26:59.277002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 
nsid:0 cdw10:0e2f006f cdw11:ed6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.689 [2024-10-04 08:26:59.277033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.689 [2024-10-04 08:26:59.277173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:6f6f6f6f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.689 [2024-10-04 08:26:59.277194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:06.689 [2024-10-04 08:26:59.277312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:6f6f6f6f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.689 [2024-10-04 08:26:59.277330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:06.689 #63 NEW cov: 11823 ft: 15117 corp: 35/802b lim: 40 exec/s: 63 rss: 69Mb L: 24/40 MS: 1 ChangeByte- 00:09:06.689 [2024-10-04 08:26:59.326601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.689 [2024-10-04 08:26:59.326629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.689 #64 NEW cov: 11823 ft: 15142 corp: 36/814b lim: 40 exec/s: 64 rss: 69Mb L: 12/40 MS: 1 InsertRepeatedBytes- 00:09:06.689 [2024-10-04 08:26:59.366999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:76700000 cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.689 [2024-10-04 08:26:59.367031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.689 [2024-10-04 08:26:59.367159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:01000000 cdw11:00000004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.689 [2024-10-04 08:26:59.367175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:06.949 #65 NEW cov: 11823 ft: 15156 corp: 37/832b lim: 40 exec/s: 65 rss: 69Mb L: 18/40 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\004"- 00:09:06.949 [2024-10-04 08:26:59.407342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0e2f006f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.949 [2024-10-04 08:26:59.407370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.949 [2024-10-04 08:26:59.407513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:6f6f6f6f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.949 [2024-10-04 08:26:59.407530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:06.949 [2024-10-04 08:26:59.407667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:6f6f6fff cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.949 [2024-10-04 08:26:59.407684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:06.949 #66 NEW cov: 11823 ft: 15167 corp: 38/856b lim: 40 exec/s: 66 rss: 69Mb L: 24/40 MS: 1 ChangeByte- 00:09:06.949 [2024-10-04 08:26:59.447731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a767000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.949 [2024-10-04 08:26:59.447757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.949 [2024-10-04 08:26:59.447882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:000048f2 cdw11:48484848 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.949 [2024-10-04 08:26:59.447896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:06.949 [2024-10-04 08:26:59.448036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:48484848 cdw11:4848484c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.949 [2024-10-04 08:26:59.448053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:06.949 [2024-10-04 08:26:59.448190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:48484848 cdw11:48484848 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.949 [2024-10-04 08:26:59.448206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:06.949 #67 NEW cov: 11823 ft: 15170 corp: 39/894b lim: 40 exec/s: 67 rss: 69Mb L: 38/40 MS: 1 ChangeByte- 00:09:06.949 [2024-10-04 08:26:59.497320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:76000070 cdw11:0000007a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.949 [2024-10-04 08:26:59.497347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.949 [2024-10-04 08:26:59.497479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000076 cdw11:70000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.949 [2024-10-04 08:26:59.497495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:06.949 #68 NEW cov: 11823 ft: 15184 corp: 40/912b lim: 40 exec/s: 68 rss: 69Mb L: 18/40 MS: 1 InsertByte- 00:09:06.949 [2024-10-04 08:26:59.547306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:19706f6f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:06.949 [2024-10-04 08:26:59.547336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.949 #69 NEW cov: 11823 ft: 15193 corp: 41/927b lim: 40 exec/s: 34 rss: 70Mb L: 15/40 MS: 1 CrossOver- 00:09:06.949 #69 DONE cov: 11823 ft: 15193 corp: 41/927b lim: 40 exec/s: 34 rss: 70Mb 00:09:06.949 ###### Recommended dictionary. ###### 00:09:06.949 "p\000\000\000\000\000\000\000" # Uses: 2 00:09:06.949 "\001\000\000\000\000\000\000\004" # Uses: 2 00:09:06.949 "\000\000" # Uses: 1 00:09:06.949 "\365\377\377\377" # Uses: 0 00:09:06.949 ###### End of recommended dictionary. 
###### 00:09:06.949 Done 69 runs in 2 second(s) 00:09:07.208 08:26:59 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_11.conf 00:09:07.208 08:26:59 -- ../common.sh@72 -- # (( i++ )) 00:09:07.208 08:26:59 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:07.208 08:26:59 -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:09:07.208 08:26:59 -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:09:07.208 08:26:59 -- nvmf/run.sh@24 -- # local timen=1 00:09:07.208 08:26:59 -- nvmf/run.sh@25 -- # local core=0x1 00:09:07.208 08:26:59 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:09:07.208 08:26:59 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:09:07.208 08:26:59 -- nvmf/run.sh@29 -- # printf %02d 12 00:09:07.208 08:26:59 -- nvmf/run.sh@29 -- # port=4412 00:09:07.208 08:26:59 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:09:07.208 08:26:59 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:09:07.208 08:26:59 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:07.208 08:26:59 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 -r /var/tmp/spdk12.sock 00:09:07.208 [2024-10-04 08:26:59.731568] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:09:07.208 [2024-10-04 08:26:59.731665] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1017423 ] 00:09:07.208 EAL: No free 2048 kB hugepages reported on node 1 00:09:07.467 [2024-10-04 08:26:59.907835] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:07.467 [2024-10-04 08:26:59.926794] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:07.467 [2024-10-04 08:26:59.926912] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:07.467 [2024-10-04 08:26:59.978095] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:07.467 [2024-10-04 08:26:59.994398] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:09:07.467 INFO: Running with entropic power schedule (0xFF, 100). 00:09:07.467 INFO: Seed: 2421795116 00:09:07.467 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:09:07.467 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:09:07.467 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:09:07.467 INFO: A corpus is not provided, starting from an empty corpus 00:09:07.467 #2 INITED exec/s: 0 rss: 60Mb 00:09:07.467 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
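Note: the "Recommended dictionary" block in the run-11 summary above lists byte strings, printed as C-style octal escapes, that repeatedly produced new coverage (the "# Uses:" counts). Stock libFuzzer can take such entries back as input through its -dict= option, whose file format uses \xNN hex escapes; whether this harness forwards extra libFuzzer flags is not visible in this log, so the following is only a sketch with a hypothetical file name, transcribing the run-11 entries from octal to hex:

  # e.g. octal \365\377\377\377 becomes hex \xf5\xff\xff\xff
  printf '%s\n' \
    'kw1="p\x00\x00\x00\x00\x00\x00\x00"' \
    'kw2="\x01\x00\x00\x00\x00\x00\x00\x04"' \
    'kw3="\x00\x00"' \
    'kw4="\xf5\xff\xff\xff"' > nvmf_11.dict

A standalone libFuzzer target would then be run with -dict=nvmf_11.dict to bias mutation toward these sequences.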
00:09:07.467 This may also happen if the target rejected all inputs we tried so far 00:09:07.467 [2024-10-04 08:27:00.039939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.467 [2024-10-04 08:27:00.039970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.467 [2024-10-04 08:27:00.040029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.467 [2024-10-04 08:27:00.040044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:07.467 [2024-10-04 08:27:00.040099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.467 [2024-10-04 08:27:00.040113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:07.726 NEW_FUNC[1/670]: 0x461d28 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:09:07.726 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:07.726 #4 NEW cov: 11589 ft: 11583 corp: 2/30b lim: 40 exec/s: 0 rss: 67Mb L: 29/29 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:09:07.726 [2024-10-04 08:27:00.360746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0e0e5d5d cdw11:5d5d5d5d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.726 [2024-10-04 08:27:00.360782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.726 [2024-10-04 08:27:00.360842] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:5d5d5d5d cdw11:5d5d5d5d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.726 [2024-10-04 08:27:00.360858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:07.726 [2024-10-04 08:27:00.360916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:5d5d5d5d cdw11:5d5d5d5d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.726 [2024-10-04 08:27:00.360931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:07.726 NEW_FUNC[1/1]: 0x16c2448 in _nvme_qpair_complete_abort_queued_reqs /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:593 00:09:07.726 #7 NEW cov: 11707 ft: 12111 corp: 3/58b lim: 40 exec/s: 0 rss: 67Mb L: 28/29 MS: 3 ChangeBit-CopyPart-InsertRepeatedBytes- 00:09:07.726 [2024-10-04 08:27:00.400628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.726 [2024-10-04 08:27:00.400656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.726 [2024-10-04 08:27:00.400713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.726 [2024-10-04 08:27:00.400728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:07.985 #8 NEW cov: 11713 ft: 12564 corp: 4/81b lim: 40 exec/s: 0 rss: 67Mb L: 23/29 MS: 1 EraseBytes- 00:09:07.985 [2024-10-04 08:27:00.440911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.985 [2024-10-04 08:27:00.440938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.985 [2024-10-04 08:27:00.440999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00040000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.985 [2024-10-04 08:27:00.441014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:07.985 [2024-10-04 08:27:00.441072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.985 [2024-10-04 08:27:00.441086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:07.985 #9 NEW cov: 11798 ft: 12818 corp: 5/110b lim: 40 exec/s: 0 rss: 67Mb L: 29/29 MS: 1 ChangeBinInt- 00:09:07.985 [2024-10-04 08:27:00.481044] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000900 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.985 [2024-10-04 08:27:00.481072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.985 [2024-10-04 08:27:00.481131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.985 [2024-10-04 08:27:00.481146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:07.985 [2024-10-04 08:27:00.481204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.985 [2024-10-04 08:27:00.481218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:07.985 #10 NEW cov: 11798 ft: 12864 corp: 6/139b lim: 40 exec/s: 0 rss: 67Mb L: 29/29 MS: 1 ChangeBinInt- 00:09:07.985 [2024-10-04 08:27:00.521286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000900 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.985 [2024-10-04 08:27:00.521312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.985 [2024-10-04 08:27:00.521369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.985 [2024-10-04 08:27:00.521384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:09:07.985 [2024-10-04 08:27:00.521441] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00575757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.985 [2024-10-04 08:27:00.521458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:07.985 [2024-10-04 08:27:00.521512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:57575700 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.985 [2024-10-04 08:27:00.521527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:07.985 #11 NEW cov: 11798 ft: 13261 corp: 7/174b lim: 40 exec/s: 0 rss: 68Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:09:07.985 [2024-10-04 08:27:00.561272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.985 [2024-10-04 08:27:00.561300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.985 [2024-10-04 08:27:00.561359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.985 [2024-10-04 08:27:00.561374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:07.985 [2024-10-04 08:27:00.561431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:000000b1 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.985 [2024-10-04 08:27:00.561446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:07.985 #12 NEW cov: 11798 ft: 13419 corp: 8/198b lim: 40 exec/s: 0 rss: 68Mb L: 24/35 MS: 1 InsertByte- 00:09:07.985 [2024-10-04 08:27:00.601358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:000000ff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.985 [2024-10-04 08:27:00.601385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.985 [2024-10-04 08:27:00.601444] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.985 [2024-10-04 08:27:00.601458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:07.986 [2024-10-04 08:27:00.601516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.986 [2024-10-04 08:27:00.601531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:07.986 #13 NEW cov: 11798 ft: 13486 corp: 9/227b lim: 40 exec/s: 0 rss: 68Mb L: 29/35 MS: 1 ChangeBinInt- 00:09:07.986 [2024-10-04 08:27:00.641468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000900 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:09:07.986 [2024-10-04 08:27:00.641496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.986 [2024-10-04 08:27:00.641554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.986 [2024-10-04 08:27:00.641569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:07.986 [2024-10-04 08:27:00.641626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:07.986 [2024-10-04 08:27:00.641641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:07.986 #14 NEW cov: 11798 ft: 13538 corp: 10/256b lim: 40 exec/s: 0 rss: 68Mb L: 29/35 MS: 1 ChangeBinInt- 00:09:08.245 [2024-10-04 08:27:00.681561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000900 cdw11:00020000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.245 [2024-10-04 08:27:00.681591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.245 [2024-10-04 08:27:00.681650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.245 [2024-10-04 08:27:00.681665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:08.245 [2024-10-04 08:27:00.681720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.245 [2024-10-04 08:27:00.681734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:08.245 #15 NEW cov: 11798 ft: 13579 corp: 11/285b lim: 40 exec/s: 0 rss: 68Mb L: 29/35 MS: 1 ChangeBit- 00:09:08.245 [2024-10-04 08:27:00.721724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0000001d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.245 [2024-10-04 08:27:00.721750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.245 [2024-10-04 08:27:00.721808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.245 [2024-10-04 08:27:00.721822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:08.245 [2024-10-04 08:27:00.721881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.245 [2024-10-04 08:27:00.721895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:08.245 #16 NEW cov: 11798 ft: 13669 corp: 12/314b lim: 40 exec/s: 0 rss: 68Mb L: 29/35 MS: 1 ChangeBinInt- 00:09:08.245 [2024-10-04 08:27:00.761785] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00370000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.245 [2024-10-04 08:27:00.761811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.245 [2024-10-04 08:27:00.761869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.245 [2024-10-04 08:27:00.761883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:08.245 [2024-10-04 08:27:00.761938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.245 [2024-10-04 08:27:00.761953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:08.245 #17 NEW cov: 11798 ft: 13770 corp: 13/338b lim: 40 exec/s: 0 rss: 68Mb L: 24/35 MS: 1 InsertByte- 00:09:08.245 [2024-10-04 08:27:00.801718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.245 [2024-10-04 08:27:00.801744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.245 [2024-10-04 08:27:00.801800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.245 [2024-10-04 08:27:00.801814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:08.245 #18 NEW cov: 11798 ft: 13781 corp: 14/361b lim: 40 exec/s: 0 rss: 68Mb L: 23/35 MS: 1 CopyPart- 00:09:08.245 [2024-10-04 08:27:00.842027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000900 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.245 [2024-10-04 08:27:00.842054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.245 [2024-10-04 08:27:00.842112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.245 [2024-10-04 08:27:00.842127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:08.245 [2024-10-04 08:27:00.842185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:01000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.245 [2024-10-04 08:27:00.842203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:08.245 #19 NEW cov: 11798 ft: 13793 corp: 15/390b lim: 40 exec/s: 0 rss: 68Mb L: 29/35 MS: 1 ChangeBit- 00:09:08.245 [2024-10-04 08:27:00.882256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.245 [2024-10-04 08:27:00.882281] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.245 [2024-10-04 08:27:00.882337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00040000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.245 [2024-10-04 08:27:00.882352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:08.245 [2024-10-04 08:27:00.882404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.245 [2024-10-04 08:27:00.882418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:08.245 [2024-10-04 08:27:00.882471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.245 [2024-10-04 08:27:00.882485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:08.245 #20 NEW cov: 11798 ft: 13812 corp: 16/423b lim: 40 exec/s: 0 rss: 68Mb L: 33/35 MS: 1 InsertRepeatedBytes- 00:09:08.245 [2024-10-04 08:27:00.922100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.245 [2024-10-04 08:27:00.922126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.245 [2024-10-04 08:27:00.922182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.245 [2024-10-04 08:27:00.922201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:08.504 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:08.504 #21 NEW cov: 11821 ft: 13848 corp: 17/446b lim: 40 exec/s: 0 rss: 68Mb L: 23/35 MS: 1 ShuffleBytes- 00:09:08.504 [2024-10-04 08:27:00.962356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:000000ff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.504 [2024-10-04 08:27:00.962383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.504 [2024-10-04 08:27:00.962443] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.504 [2024-10-04 08:27:00.962461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:08.504 [2024-10-04 08:27:00.962517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.504 [2024-10-04 08:27:00.962531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:08.504 #22 NEW cov: 11821 ft: 13881 corp: 18/472b lim: 40 exec/s: 0 rss: 68Mb L: 26/35 MS: 1 EraseBytes- 00:09:08.504 [2024-10-04 
08:27:01.002311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:000000ff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.504 [2024-10-04 08:27:01.002338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.504 [2024-10-04 08:27:01.002397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.505 [2024-10-04 08:27:01.002411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:08.505 #28 NEW cov: 11821 ft: 13895 corp: 19/491b lim: 40 exec/s: 0 rss: 68Mb L: 19/35 MS: 1 EraseBytes- 00:09:08.505 [2024-10-04 08:27:01.042590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.505 [2024-10-04 08:27:01.042616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.505 [2024-10-04 08:27:01.042675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.505 [2024-10-04 08:27:01.042690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:08.505 [2024-10-04 08:27:01.042749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0078c233 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.505 [2024-10-04 08:27:01.042763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:08.505 #29 NEW cov: 11821 ft: 13917 corp: 20/520b lim: 40 exec/s: 29 rss: 68Mb L: 29/35 MS: 1 CMP- DE: "x\3023\204~\340h\000"- 00:09:08.505 [2024-10-04 08:27:01.082361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.505 [2024-10-04 08:27:01.082386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.505 #30 NEW cov: 11821 ft: 14650 corp: 21/535b lim: 40 exec/s: 30 rss: 68Mb L: 15/35 MS: 1 CrossOver- 00:09:08.505 [2024-10-04 08:27:01.122800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:28000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.505 [2024-10-04 08:27:01.122826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.505 [2024-10-04 08:27:01.122883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.505 [2024-10-04 08:27:01.122898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:08.505 [2024-10-04 08:27:01.122954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.505 [2024-10-04 
08:27:01.122968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:08.505 #31 NEW cov: 11821 ft: 14657 corp: 22/559b lim: 40 exec/s: 31 rss: 68Mb L: 24/35 MS: 1 InsertByte- 00:09:08.505 [2024-10-04 08:27:01.152921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0037e7e7 cdw11:e7e7e700 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.505 [2024-10-04 08:27:01.152947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.505 [2024-10-04 08:27:01.153007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.505 [2024-10-04 08:27:01.153021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:08.505 [2024-10-04 08:27:01.153079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.505 [2024-10-04 08:27:01.153094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:08.505 #32 NEW cov: 11821 ft: 14673 corp: 23/588b lim: 40 exec/s: 32 rss: 68Mb L: 29/35 MS: 1 InsertRepeatedBytes- 00:09:08.764 [2024-10-04 08:27:01.193234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:000000ff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.764 [2024-10-04 08:27:01.193260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.764 [2024-10-04 08:27:01.193329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.764 [2024-10-04 08:27:01.193344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:08.764 [2024-10-04 08:27:01.193389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.764 [2024-10-04 08:27:01.193403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:08.764 [2024-10-04 08:27:01.193457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.764 [2024-10-04 08:27:01.193470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:08.764 #33 NEW cov: 11821 ft: 14699 corp: 24/623b lim: 40 exec/s: 33 rss: 68Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:09:08.764 [2024-10-04 08:27:01.232963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.764 [2024-10-04 08:27:01.232988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.764 [2024-10-04 08:27:01.233044] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.764 [2024-10-04 08:27:01.233058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:08.764 #34 NEW cov: 11821 ft: 14726 corp: 25/646b lim: 40 exec/s: 34 rss: 68Mb L: 23/35 MS: 1 EraseBytes- 00:09:08.764 [2024-10-04 08:27:01.273104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0000001d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.764 [2024-10-04 08:27:01.273130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.764 [2024-10-04 08:27:01.273191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.764 [2024-10-04 08:27:01.273204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:08.764 #35 NEW cov: 11821 ft: 14749 corp: 26/663b lim: 40 exec/s: 35 rss: 68Mb L: 17/35 MS: 1 CrossOver- 00:09:08.764 [2024-10-04 08:27:01.313375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:000000ff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.764 [2024-10-04 08:27:01.313401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.764 [2024-10-04 08:27:01.313456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.764 [2024-10-04 08:27:01.313470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:08.764 [2024-10-04 08:27:01.313526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.764 [2024-10-04 08:27:01.313540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:08.764 #36 NEW cov: 11821 ft: 14771 corp: 27/692b lim: 40 exec/s: 36 rss: 68Mb L: 29/35 MS: 1 ChangeBinInt- 00:09:08.764 [2024-10-04 08:27:01.343629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:000000ff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.764 [2024-10-04 08:27:01.343655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.764 [2024-10-04 08:27:01.343711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.764 [2024-10-04 08:27:01.343725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:08.764 [2024-10-04 08:27:01.343782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.764 [2024-10-04 08:27:01.343796] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:08.764 [2024-10-04 08:27:01.343848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.764 [2024-10-04 08:27:01.343863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:08.764 #37 NEW cov: 11821 ft: 14788 corp: 28/729b lim: 40 exec/s: 37 rss: 69Mb L: 37/37 MS: 1 CopyPart- 00:09:08.764 [2024-10-04 08:27:01.383518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0e0e5d5d cdw11:5d5d5d5d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.764 [2024-10-04 08:27:01.383544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.764 [2024-10-04 08:27:01.383602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:5d5d5d00 cdw11:5d5d5d5d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.764 [2024-10-04 08:27:01.383616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:08.764 [2024-10-04 08:27:01.383671] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:5d5d5d5d cdw11:5d5d5d5d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.764 [2024-10-04 08:27:01.383685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:08.764 #38 NEW cov: 11821 ft: 14810 corp: 29/757b lim: 40 exec/s: 38 rss: 69Mb L: 28/37 MS: 1 ChangeByte- 00:09:08.764 [2024-10-04 08:27:01.423698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:000000ff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.764 [2024-10-04 08:27:01.423727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.764 [2024-10-04 08:27:01.423786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.764 [2024-10-04 08:27:01.423801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:08.764 [2024-10-04 08:27:01.423855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:08.764 [2024-10-04 08:27:01.423869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:08.764 #39 NEW cov: 11821 ft: 14879 corp: 30/786b lim: 40 exec/s: 39 rss: 69Mb L: 29/37 MS: 1 CrossOver- 00:09:09.025 [2024-10-04 08:27:01.463808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000900 cdw11:00020000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.025 [2024-10-04 08:27:01.463834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.025 [2024-10-04 08:27:01.463891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.025 [2024-10-04 08:27:01.463908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:09.025 [2024-10-04 08:27:01.463961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.025 [2024-10-04 08:27:01.463975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:09.025 #40 NEW cov: 11821 ft: 14900 corp: 31/815b lim: 40 exec/s: 40 rss: 69Mb L: 29/37 MS: 1 ChangeBinInt- 00:09:09.025 [2024-10-04 08:27:01.504102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.025 [2024-10-04 08:27:01.504128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.025 [2024-10-04 08:27:01.504183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffff47 cdw11:28000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.025 [2024-10-04 08:27:01.504202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:09.025 [2024-10-04 08:27:01.504258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.025 [2024-10-04 08:27:01.504273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:09.025 [2024-10-04 08:27:01.504331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.025 [2024-10-04 08:27:01.504344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:09.025 #41 NEW cov: 11821 ft: 14926 corp: 32/847b lim: 40 exec/s: 41 rss: 69Mb L: 32/37 MS: 1 CMP- DE: "\377\377\377\377\377\377\377G"- 00:09:09.025 [2024-10-04 08:27:01.543891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.025 [2024-10-04 08:27:01.543917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.025 [2024-10-04 08:27:01.543969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffff47 cdw11:28000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.025 [2024-10-04 08:27:01.543986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:09.025 #42 NEW cov: 11821 ft: 14952 corp: 33/868b lim: 40 exec/s: 42 rss: 69Mb L: 21/37 MS: 1 EraseBytes- 00:09:09.025 [2024-10-04 08:27:01.584044] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.025 [2024-10-04 08:27:01.584071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:09:09.025 [2024-10-04 08:27:01.584126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.025 [2024-10-04 08:27:01.584141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:09.025 #43 NEW cov: 11821 ft: 15014 corp: 34/888b lim: 40 exec/s: 43 rss: 69Mb L: 20/37 MS: 1 EraseBytes- 00:09:09.025 [2024-10-04 08:27:01.624273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:000000ff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.025 [2024-10-04 08:27:01.624299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.025 [2024-10-04 08:27:01.624356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.025 [2024-10-04 08:27:01.624370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:09.025 [2024-10-04 08:27:01.624426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.025 [2024-10-04 08:27:01.624440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:09.025 #44 NEW cov: 11821 ft: 15023 corp: 35/914b lim: 40 exec/s: 44 rss: 69Mb L: 26/37 MS: 1 ShuffleBytes- 00:09:09.025 [2024-10-04 08:27:01.654192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00007a00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.025 [2024-10-04 08:27:01.654217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.025 [2024-10-04 08:27:01.654272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.025 [2024-10-04 08:27:01.654286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:09.025 #45 NEW cov: 11821 ft: 15028 corp: 36/930b lim: 40 exec/s: 45 rss: 69Mb L: 16/37 MS: 1 InsertByte- 00:09:09.025 [2024-10-04 08:27:01.694499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:0000ff00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.025 [2024-10-04 08:27:01.694525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.025 [2024-10-04 08:27:01.694582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.025 [2024-10-04 08:27:01.694596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:09.025 [2024-10-04 08:27:01.694651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.025 [2024-10-04 
08:27:01.694665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:09.284 #46 NEW cov: 11821 ft: 15051 corp: 37/956b lim: 40 exec/s: 46 rss: 69Mb L: 26/37 MS: 1 ShuffleBytes- 00:09:09.284 [2024-10-04 08:27:01.734827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:000000ff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.284 [2024-10-04 08:27:01.734854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.284 [2024-10-04 08:27:01.734908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.284 [2024-10-04 08:27:01.734922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:09.284 [2024-10-04 08:27:01.734974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.284 [2024-10-04 08:27:01.734988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:09.284 [2024-10-04 08:27:01.735043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffff47 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.284 [2024-10-04 08:27:01.735057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:09.284 #47 NEW cov: 11821 ft: 15070 corp: 38/993b lim: 40 exec/s: 47 rss: 70Mb L: 37/37 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377G"- 00:09:09.284 [2024-10-04 08:27:01.774584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0000001d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.284 [2024-10-04 08:27:01.774611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.284 [2024-10-04 08:27:01.774666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.284 [2024-10-04 08:27:01.774680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:09.284 #48 NEW cov: 11821 ft: 15079 corp: 39/1010b lim: 40 exec/s: 48 rss: 70Mb L: 17/37 MS: 1 ShuffleBytes- 00:09:09.284 [2024-10-04 08:27:01.814563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0e0e5d5d cdw11:5d5d5d5d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.284 [2024-10-04 08:27:01.814590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.284 #49 NEW cov: 11821 ft: 15136 corp: 40/1024b lim: 40 exec/s: 49 rss: 70Mb L: 14/37 MS: 1 EraseBytes- 00:09:09.284 [2024-10-04 08:27:01.855021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0037e7e7 cdw11:e7e7e700 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.284 [2024-10-04 08:27:01.855048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.284 [2024-10-04 08:27:01.855106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.284 [2024-10-04 08:27:01.855120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:09.284 [2024-10-04 08:27:01.855175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.284 [2024-10-04 08:27:01.855193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:09.284 #50 NEW cov: 11821 ft: 15151 corp: 41/1053b lim: 40 exec/s: 50 rss: 70Mb L: 29/37 MS: 1 CopyPart- 00:09:09.284 [2024-10-04 08:27:01.895202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0e0e5d5d cdw11:00000900 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.284 [2024-10-04 08:27:01.895232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.284 [2024-10-04 08:27:01.895298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.284 [2024-10-04 08:27:01.895312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:09.284 [2024-10-04 08:27:01.895367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:01000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.284 [2024-10-04 08:27:01.895380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:09.284 [2024-10-04 08:27:01.895432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:5d5d5d5d cdw11:5d5d5d5d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.284 [2024-10-04 08:27:01.895445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:09.284 #51 NEW cov: 11821 ft: 15170 corp: 42/1092b lim: 40 exec/s: 51 rss: 70Mb L: 39/39 MS: 1 CrossOver- 00:09:09.284 [2024-10-04 08:27:01.935221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000900 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.284 [2024-10-04 08:27:01.935248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.284 [2024-10-04 08:27:01.935307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:02000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.284 [2024-10-04 08:27:01.935320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:09.284 [2024-10-04 08:27:01.935380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:01000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.284 [2024-10-04 08:27:01.935394] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:09.284 #52 NEW cov: 11821 ft: 15216 corp: 43/1121b lim: 40 exec/s: 52 rss: 70Mb L: 29/39 MS: 1 ChangeBit- 00:09:09.543 [2024-10-04 08:27:01.975451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:000000ff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.543 [2024-10-04 08:27:01.975477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.543 [2024-10-04 08:27:01.975534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.543 [2024-10-04 08:27:01.975548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:09.543 [2024-10-04 08:27:01.975604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.543 [2024-10-04 08:27:01.975618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:09.543 [2024-10-04 08:27:01.975671] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.543 [2024-10-04 08:27:01.975685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:09.543 #53 NEW cov: 11821 ft: 15255 corp: 44/1158b lim: 40 exec/s: 53 rss: 70Mb L: 37/39 MS: 1 CrossOver- 00:09:09.543 [2024-10-04 08:27:02.015458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:002100ff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.543 [2024-10-04 08:27:02.015487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.543 [2024-10-04 08:27:02.015545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.543 [2024-10-04 08:27:02.015559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:09.543 [2024-10-04 08:27:02.015615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.543 [2024-10-04 08:27:02.015629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:09.544 #54 NEW cov: 11821 ft: 15256 corp: 45/1187b lim: 40 exec/s: 27 rss: 70Mb L: 29/39 MS: 1 ChangeByte- 00:09:09.544 #54 DONE cov: 11821 ft: 15256 corp: 45/1187b lim: 40 exec/s: 27 rss: 70Mb 00:09:09.544 ###### Recommended dictionary. ###### 00:09:09.544 "x\3023\204~\340h\000" # Uses: 0 00:09:09.544 "\377\377\377\377\377\377\377G" # Uses: 1 00:09:09.544 ###### End of recommended dictionary. 
###### 00:09:09.544 Done 54 runs in 2 second(s) 00:09:09.544 08:27:02 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_12.conf 00:09:09.544 08:27:02 -- ../common.sh@72 -- # (( i++ )) 00:09:09.544 08:27:02 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:09.544 08:27:02 -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:09:09.544 08:27:02 -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:09:09.544 08:27:02 -- nvmf/run.sh@24 -- # local timen=1 00:09:09.544 08:27:02 -- nvmf/run.sh@25 -- # local core=0x1 00:09:09.544 08:27:02 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:09:09.544 08:27:02 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:09:09.544 08:27:02 -- nvmf/run.sh@29 -- # printf %02d 13 00:09:09.544 08:27:02 -- nvmf/run.sh@29 -- # port=4413 00:09:09.544 08:27:02 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:09:09.544 08:27:02 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:09:09.544 08:27:02 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:09.544 08:27:02 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 -r /var/tmp/spdk13.sock 00:09:09.544 [2024-10-04 08:27:02.198801] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:09:09.544 [2024-10-04 08:27:02.198896] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1018022 ] 00:09:09.802 EAL: No free 2048 kB hugepages reported on node 1 00:09:09.802 [2024-10-04 08:27:02.384046] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:09.802 [2024-10-04 08:27:02.403458] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:09.802 [2024-10-04 08:27:02.403583] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:09.802 [2024-10-04 08:27:02.455079] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:09.802 [2024-10-04 08:27:02.471457] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:09:10.093 INFO: Running with entropic power schedule (0xFF, 100). 00:09:10.093 INFO: Seed: 602914187 00:09:10.093 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:09:10.093 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:09:10.093 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:09:10.093 INFO: A corpus is not provided, starting from an empty corpus 00:09:10.093 #2 INITED exec/s: 0 rss: 59Mb 00:09:10.093 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
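[editorial note] Each run above ends with a "###### Recommended dictionary. ######" block listing byte sequences the fuzzer found productive, printed with octal escapes and a use count. As a hedged aside — nothing in this log actually does this — such entries could be fed back into a later run through libFuzzer's -dict= option, assuming the llvm_nvme_fuzz harness forwards unrecognized options to libFuzzer (not shown here). libFuzzer dictionary files accept \xNN hex escapes rather than octal, so run 12's two entries ("x\3023\204~\340h\000" and "\377\377\377\377\377\377\377G") are re-escaped in the sketch:

    # Hedged sketch: replaying run 12's recommended dictionary on a later run.
    # Assumes -dict= is passed through to libFuzzer; the entries below are the
    # two printed above, converted from octal escapes to libFuzzer's \xNN form.
    cat > /tmp/llvm_nvmf_12.dict <<'EOF'
    kw1="\x78\xc2\x33\x84\x7e\xe0\x68\x00"
    kw2="\xff\xff\xff\xff\xff\xff\xff\x47"
    EOF
    # ...then append -dict=/tmp/llvm_nvmf_12.dict to the llvm_nvme_fuzz
    # command line shown in the trace above.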
00:09:10.093 This may also happen if the target rejected all inputs we tried so far 00:09:10.093 [2024-10-04 08:27:02.530868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.093 [2024-10-04 08:27:02.530900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.093 [2024-10-04 08:27:02.530957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.093 [2024-10-04 08:27:02.530971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:10.093 [2024-10-04 08:27:02.531027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.093 [2024-10-04 08:27:02.531041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:10.387 NEW_FUNC[1/670]: 0x4638f8 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:09:10.387 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:10.387 #5 NEW cov: 11582 ft: 11563 corp: 2/31b lim: 40 exec/s: 0 rss: 66Mb L: 30/30 MS: 3 ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:09:10.387 [2024-10-04 08:27:02.841593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a190a0a cdw11:190a0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.387 [2024-10-04 08:27:02.841657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.387 #9 NEW cov: 11695 ft: 12615 corp: 3/45b lim: 40 exec/s: 0 rss: 68Mb L: 14/30 MS: 4 CrossOver-CMP-CrossOver-CopyPart- DE: "\031\000\000\000"- 00:09:10.387 [2024-10-04 08:27:02.891635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.387 [2024-10-04 08:27:02.891663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.387 [2024-10-04 08:27:02.891717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:3dffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.387 [2024-10-04 08:27:02.891732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:10.387 [2024-10-04 08:27:02.891789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.387 [2024-10-04 08:27:02.891802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:10.387 #10 NEW cov: 11701 ft: 12843 corp: 4/76b lim: 40 exec/s: 0 rss: 68Mb L: 31/31 MS: 1 InsertByte- 00:09:10.387 [2024-10-04 08:27:02.931529] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:290affff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.387 [2024-10-04 08:27:02.931557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.387 #13 NEW cov: 11786 ft: 13177 corp: 5/86b lim: 40 exec/s: 0 rss: 68Mb L: 10/31 MS: 3 ChangeByte-CopyPart-CrossOver- 00:09:10.387 [2024-10-04 08:27:02.971851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.387 [2024-10-04 08:27:02.971881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.387 [2024-10-04 08:27:02.971938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:3dff0bff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.387 [2024-10-04 08:27:02.971952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:10.387 [2024-10-04 08:27:02.972003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.387 [2024-10-04 08:27:02.972017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:10.387 #14 NEW cov: 11786 ft: 13209 corp: 6/117b lim: 40 exec/s: 0 rss: 68Mb L: 31/31 MS: 1 ChangeBinInt- 00:09:10.387 [2024-10-04 08:27:03.012015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.387 [2024-10-04 08:27:03.012042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.387 [2024-10-04 08:27:03.012099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:3dffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.387 [2024-10-04 08:27:03.012113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:10.387 [2024-10-04 08:27:03.012168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff40ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.387 [2024-10-04 08:27:03.012182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:10.387 #15 NEW cov: 11786 ft: 13282 corp: 7/148b lim: 40 exec/s: 0 rss: 68Mb L: 31/31 MS: 1 ChangeByte- 00:09:10.387 [2024-10-04 08:27:03.052109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.387 [2024-10-04 08:27:03.052135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.387 [2024-10-04 08:27:03.052199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.387 [2024-10-04 
08:27:03.052214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:10.387 [2024-10-04 08:27:03.052268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:dfffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.387 [2024-10-04 08:27:03.052282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:10.646 #16 NEW cov: 11786 ft: 13389 corp: 8/178b lim: 40 exec/s: 0 rss: 68Mb L: 30/31 MS: 1 ChangeBit- 00:09:10.646 [2024-10-04 08:27:03.092339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.646 [2024-10-04 08:27:03.092365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.647 [2024-10-04 08:27:03.092420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:3dff0bff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.647 [2024-10-04 08:27:03.092434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:10.647 [2024-10-04 08:27:03.092493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:95ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.647 [2024-10-04 08:27:03.092511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:10.647 [2024-10-04 08:27:03.092565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.647 [2024-10-04 08:27:03.092580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:10.647 #17 NEW cov: 11786 ft: 13852 corp: 9/210b lim: 40 exec/s: 0 rss: 68Mb L: 32/32 MS: 1 InsertByte- 00:09:10.647 [2024-10-04 08:27:03.132263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a190a0a cdw11:190a0019 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.647 [2024-10-04 08:27:03.132289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.647 [2024-10-04 08:27:03.132348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:000a0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.647 [2024-10-04 08:27:03.132363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:10.647 #18 NEW cov: 11786 ft: 14074 corp: 10/228b lim: 40 exec/s: 0 rss: 68Mb L: 18/32 MS: 1 PersAutoDict- DE: "\031\000\000\000"- 00:09:10.647 [2024-10-04 08:27:03.172232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:290affff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.647 [2024-10-04 08:27:03.172259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.647 #19 NEW cov: 11786 ft: 14120 
corp: 11/242b lim: 40 exec/s: 0 rss: 68Mb L: 14/32 MS: 1 PersAutoDict- DE: "\031\000\000\000"- 00:09:10.647 [2024-10-04 08:27:03.212730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.647 [2024-10-04 08:27:03.212756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.647 [2024-10-04 08:27:03.212812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:19000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.647 [2024-10-04 08:27:03.212827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:10.647 [2024-10-04 08:27:03.212884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:3dff0bff cdw11:95ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.647 [2024-10-04 08:27:03.212898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:10.647 [2024-10-04 08:27:03.212953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.647 [2024-10-04 08:27:03.212967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:10.647 #20 NEW cov: 11786 ft: 14155 corp: 12/278b lim: 40 exec/s: 0 rss: 68Mb L: 36/36 MS: 1 PersAutoDict- DE: "\031\000\000\000"- 00:09:10.647 [2024-10-04 08:27:03.252739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.647 [2024-10-04 08:27:03.252765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.647 [2024-10-04 08:27:03.252822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:f7ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.647 [2024-10-04 08:27:03.252840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:10.647 [2024-10-04 08:27:03.252893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.647 [2024-10-04 08:27:03.252907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:10.647 #26 NEW cov: 11786 ft: 14179 corp: 13/308b lim: 40 exec/s: 0 rss: 68Mb L: 30/36 MS: 1 ChangeBit- 00:09:10.647 [2024-10-04 08:27:03.292681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a190a0a cdw11:190a0019 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.647 [2024-10-04 08:27:03.292708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.647 [2024-10-04 08:27:03.292765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00f6ff00 SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:09:10.647 [2024-10-04 08:27:03.292779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:10.647 #27 NEW cov: 11786 ft: 14262 corp: 14/326b lim: 40 exec/s: 0 rss: 68Mb L: 18/36 MS: 1 ChangeBinInt- 00:09:10.906 [2024-10-04 08:27:03.333234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.906 [2024-10-04 08:27:03.333261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.906 [2024-10-04 08:27:03.333318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ff190000 cdw11:00ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.906 [2024-10-04 08:27:03.333332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:10.906 [2024-10-04 08:27:03.333391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:19000000 cdw11:3dff0bff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.906 [2024-10-04 08:27:03.333404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:10.906 [2024-10-04 08:27:03.333459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:95ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.906 [2024-10-04 08:27:03.333473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:10.906 [2024-10-04 08:27:03.333531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.906 [2024-10-04 08:27:03.333545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:10.906 #28 NEW cov: 11786 ft: 14319 corp: 15/366b lim: 40 exec/s: 0 rss: 68Mb L: 40/40 MS: 1 PersAutoDict- DE: "\031\000\000\000"- 00:09:10.907 [2024-10-04 08:27:03.373159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.907 [2024-10-04 08:27:03.373192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.907 [2024-10-04 08:27:03.373250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:3dff0bff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.907 [2024-10-04 08:27:03.373265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:10.907 [2024-10-04 08:27:03.373321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:95ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.907 [2024-10-04 08:27:03.373339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:10.907 [2024-10-04 08:27:03.373382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.907 [2024-10-04 08:27:03.373396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:10.907 #29 NEW cov: 11786 ft: 14326 corp: 16/401b lim: 40 exec/s: 0 rss: 69Mb L: 35/40 MS: 1 CopyPart- 00:09:10.907 [2024-10-04 08:27:03.412912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:1a0a0100 cdw11:001f0100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.907 [2024-10-04 08:27:03.412939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.907 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:10.907 #33 NEW cov: 11809 ft: 14359 corp: 17/415b lim: 40 exec/s: 0 rss: 69Mb L: 14/40 MS: 4 CopyPart-ChangeBit-CMP-CMP- DE: "\001\000\000\037"-"\001\000\000\000\000\000\000\000"- 00:09:10.907 [2024-10-04 08:27:03.453009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:1a0a0100 cdw11:001f0100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.907 [2024-10-04 08:27:03.453035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.907 #34 NEW cov: 11809 ft: 14412 corp: 18/430b lim: 40 exec/s: 0 rss: 69Mb L: 15/40 MS: 1 InsertByte- 00:09:10.907 [2024-10-04 08:27:03.493638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.907 [2024-10-04 08:27:03.493664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.907 [2024-10-04 08:27:03.493719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:3dff0bff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.907 [2024-10-04 08:27:03.493733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:10.907 [2024-10-04 08:27:03.493785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:95ffffff cdw11:ff777777 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.907 [2024-10-04 08:27:03.493798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:10.907 [2024-10-04 08:27:03.493830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7777ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.907 [2024-10-04 08:27:03.493843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:10.907 [2024-10-04 08:27:03.493897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.907 [2024-10-04 08:27:03.493911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:10.907 #35 NEW cov: 11809 ft: 14442 corp: 19/470b lim: 40 exec/s: 35 rss: 69Mb L: 40/40 MS: 1 
InsertRepeatedBytes- 00:09:10.907 [2024-10-04 08:27:03.533648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.907 [2024-10-04 08:27:03.533674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.907 [2024-10-04 08:27:03.533735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.907 [2024-10-04 08:27:03.533753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:10.907 [2024-10-04 08:27:03.533811] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffff3dff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.907 [2024-10-04 08:27:03.533825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:10.907 [2024-10-04 08:27:03.533882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:40ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.907 [2024-10-04 08:27:03.533895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:10.907 #36 NEW cov: 11809 ft: 14455 corp: 20/507b lim: 40 exec/s: 36 rss: 69Mb L: 37/40 MS: 1 InsertRepeatedBytes- 00:09:10.907 [2024-10-04 08:27:03.573748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ff0affff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.907 [2024-10-04 08:27:03.573775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.907 [2024-10-04 08:27:03.573832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff3dffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.907 [2024-10-04 08:27:03.573846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:10.907 [2024-10-04 08:27:03.573899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffff40 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.907 [2024-10-04 08:27:03.573913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:10.907 [2024-10-04 08:27:03.573966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:10.907 [2024-10-04 08:27:03.573980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:11.166 #37 NEW cov: 11809 ft: 14469 corp: 21/539b lim: 40 exec/s: 37 rss: 69Mb L: 32/40 MS: 1 InsertByte- 00:09:11.166 [2024-10-04 08:27:03.613467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:1a0a0100 cdw11:001f0100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.166 [2024-10-04 08:27:03.613494] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.166 #38 NEW cov: 11809 ft: 14520 corp: 22/554b lim: 40 exec/s: 38 rss: 69Mb L: 15/40 MS: 1 CrossOver- 00:09:11.166 [2024-10-04 08:27:03.654066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.167 [2024-10-04 08:27:03.654094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.167 [2024-10-04 08:27:03.654149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:3dff0bff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.167 [2024-10-04 08:27:03.654164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:11.167 [2024-10-04 08:27:03.654222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:95ffffff cdw11:ff777777 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.167 [2024-10-04 08:27:03.654236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:11.167 [2024-10-04 08:27:03.654295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7777ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.167 [2024-10-04 08:27:03.654309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:11.167 [2024-10-04 08:27:03.654364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.167 [2024-10-04 08:27:03.654377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:11.167 #39 NEW cov: 11809 ft: 14551 corp: 23/594b lim: 40 exec/s: 39 rss: 69Mb L: 40/40 MS: 1 CopyPart- 00:09:11.167 [2024-10-04 08:27:03.693846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffdfff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.167 [2024-10-04 08:27:03.693873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.167 [2024-10-04 08:27:03.693930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.167 [2024-10-04 08:27:03.693944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:11.167 #40 NEW cov: 11809 ft: 14597 corp: 24/610b lim: 40 exec/s: 40 rss: 69Mb L: 16/40 MS: 1 EraseBytes- 00:09:11.167 [2024-10-04 08:27:03.733935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:861a0a01 cdw11:00001f01 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.167 [2024-10-04 08:27:03.733962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.167 [2024-10-04 08:27:03.734019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0a000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.167 [2024-10-04 08:27:03.734033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:11.167 #41 NEW cov: 11809 ft: 14638 corp: 25/626b lim: 40 exec/s: 41 rss: 69Mb L: 16/40 MS: 1 InsertByte- 00:09:11.167 [2024-10-04 08:27:03.774103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:1f010000 cdw11:0000290a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.167 [2024-10-04 08:27:03.774130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.167 [2024-10-04 08:27:03.774192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.167 [2024-10-04 08:27:03.774207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:11.167 #42 NEW cov: 11809 ft: 14656 corp: 26/646b lim: 40 exec/s: 42 rss: 69Mb L: 20/40 MS: 1 CrossOver- 00:09:11.167 [2024-10-04 08:27:03.814176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.167 [2024-10-04 08:27:03.814208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.167 [2024-10-04 08:27:03.814269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.167 [2024-10-04 08:27:03.814283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:11.167 #43 NEW cov: 11809 ft: 14660 corp: 27/666b lim: 40 exec/s: 43 rss: 69Mb L: 20/40 MS: 1 EraseBytes- 00:09:11.427 [2024-10-04 08:27:03.854692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffff0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.427 [2024-10-04 08:27:03.854722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.427 [2024-10-04 08:27:03.854780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:03ff0bff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.427 [2024-10-04 08:27:03.854795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:11.427 [2024-10-04 08:27:03.854849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:95ffffff cdw11:ff777777 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.427 [2024-10-04 08:27:03.854863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:11.427 [2024-10-04 08:27:03.854917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7777ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.427 [2024-10-04 08:27:03.854930] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:11.427 [2024-10-04 08:27:03.854989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.427 [2024-10-04 08:27:03.855003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:11.427 #44 NEW cov: 11809 ft: 14669 corp: 28/706b lim: 40 exec/s: 44 rss: 70Mb L: 40/40 MS: 1 CMP- DE: "\000\000\000\000\000\000\003\377"- 00:09:11.427 [2024-10-04 08:27:03.894423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:1f010000 cdw11:0000290a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.427 [2024-10-04 08:27:03.894450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.427 [2024-10-04 08:27:03.894505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:0000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.427 [2024-10-04 08:27:03.894519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:11.427 #45 NEW cov: 11809 ft: 14680 corp: 29/726b lim: 40 exec/s: 45 rss: 70Mb L: 20/40 MS: 1 CrossOver- 00:09:11.427 [2024-10-04 08:27:03.934741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ff0affff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.427 [2024-10-04 08:27:03.934768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.427 [2024-10-04 08:27:03.934825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff3dffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.427 [2024-10-04 08:27:03.934839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:11.427 [2024-10-04 08:27:03.934893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:fffbffff cdw11:ffffff40 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.427 [2024-10-04 08:27:03.934907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:11.427 [2024-10-04 08:27:03.934960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.427 [2024-10-04 08:27:03.934975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:11.427 #46 NEW cov: 11809 ft: 14686 corp: 30/758b lim: 40 exec/s: 46 rss: 70Mb L: 32/40 MS: 1 ChangeBit- 00:09:11.427 [2024-10-04 08:27:03.974520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a190a0a cdw11:190a0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.427 [2024-10-04 08:27:03.974547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.427 #47 NEW cov: 11809 ft: 14702 corp: 31/772b lim: 40 
exec/s: 47 rss: 70Mb L: 14/40 MS: 1 CopyPart- 00:09:11.427 [2024-10-04 08:27:04.015119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.427 [2024-10-04 08:27:04.015146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.427 [2024-10-04 08:27:04.015205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ff190000 cdw11:00ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.427 [2024-10-04 08:27:04.015221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:11.427 [2024-10-04 08:27:04.015274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:19000000 cdw11:3dff0bff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.427 [2024-10-04 08:27:04.015288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:11.427 [2024-10-04 08:27:04.015343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:95fffff7 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.427 [2024-10-04 08:27:04.015359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:11.427 [2024-10-04 08:27:04.015414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.427 [2024-10-04 08:27:04.015428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:11.427 #48 NEW cov: 11809 ft: 14726 corp: 32/812b lim: 40 exec/s: 48 rss: 70Mb L: 40/40 MS: 1 ChangeBit- 00:09:11.427 [2024-10-04 08:27:04.055127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.427 [2024-10-04 08:27:04.055153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.427 [2024-10-04 08:27:04.055219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:19000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.427 [2024-10-04 08:27:04.055233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:11.427 [2024-10-04 08:27:04.055289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:3dff0bff cdw11:95ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.427 [2024-10-04 08:27:04.055302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:11.427 [2024-10-04 08:27:04.055359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:000003ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.427 [2024-10-04 08:27:04.055372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 
00:09:11.427 #49 NEW cov: 11809 ft: 14740 corp: 33/848b lim: 40 exec/s: 49 rss: 70Mb L: 36/40 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\003\377"- 00:09:11.427 [2024-10-04 08:27:04.094875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a190a2c cdw11:0a190a0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.427 [2024-10-04 08:27:04.094905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.687 #51 NEW cov: 11809 ft: 14756 corp: 34/856b lim: 40 exec/s: 51 rss: 70Mb L: 8/40 MS: 2 EraseBytes-InsertByte- 00:09:11.687 [2024-10-04 08:27:04.135333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.687 [2024-10-04 08:27:04.135360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.687 [2024-10-04 08:27:04.135415] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:19000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.687 [2024-10-04 08:27:04.135429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:11.687 [2024-10-04 08:27:04.135483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:3dff0bff cdw11:95ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.687 [2024-10-04 08:27:04.135497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:11.687 [2024-10-04 08:27:04.135551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:fffffbff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.687 [2024-10-04 08:27:04.135564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:11.687 #52 NEW cov: 11809 ft: 14853 corp: 35/892b lim: 40 exec/s: 52 rss: 70Mb L: 36/40 MS: 1 ChangeBit- 00:09:11.687 [2024-10-04 08:27:04.175225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffff95ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.687 [2024-10-04 08:27:04.175251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.687 [2024-10-04 08:27:04.175307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.687 [2024-10-04 08:27:04.175322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:11.687 #53 NEW cov: 11809 ft: 14862 corp: 36/914b lim: 40 exec/s: 53 rss: 70Mb L: 22/40 MS: 1 EraseBytes- 00:09:11.687 [2024-10-04 08:27:04.215141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a1d0a0a cdw11:190a0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.687 [2024-10-04 08:27:04.215167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.687 #54 NEW cov: 11809 ft: 
14954 corp: 37/928b lim: 40 exec/s: 54 rss: 70Mb L: 14/40 MS: 1 ChangeBit- 00:09:11.687 [2024-10-04 08:27:04.255666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.687 [2024-10-04 08:27:04.255692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.687 [2024-10-04 08:27:04.255749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.687 [2024-10-04 08:27:04.255764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:11.687 [2024-10-04 08:27:04.255819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:99ffff3d cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.687 [2024-10-04 08:27:04.255833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:11.687 [2024-10-04 08:27:04.255885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ff40ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.687 [2024-10-04 08:27:04.255903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:11.687 #55 NEW cov: 11809 ft: 14990 corp: 38/966b lim: 40 exec/s: 55 rss: 70Mb L: 38/40 MS: 1 InsertByte- 00:09:11.687 [2024-10-04 08:27:04.295910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:fffffffe SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.687 [2024-10-04 08:27:04.295936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.687 [2024-10-04 08:27:04.295992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:3dff0bff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.687 [2024-10-04 08:27:04.296006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:11.688 [2024-10-04 08:27:04.296058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:95ffffff cdw11:ff777777 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.688 [2024-10-04 08:27:04.296072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:11.688 [2024-10-04 08:27:04.296125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7777ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.688 [2024-10-04 08:27:04.296138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:11.688 [2024-10-04 08:27:04.296196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.688 [2024-10-04 08:27:04.296210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 
cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:11.688 #56 NEW cov: 11809 ft: 14999 corp: 39/1006b lim: 40 exec/s: 56 rss: 70Mb L: 40/40 MS: 1 ChangeBit- 00:09:11.688 [2024-10-04 08:27:04.335782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a190a0a cdw11:190a0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.688 [2024-10-04 08:27:04.335809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.688 [2024-10-04 08:27:04.335866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:000a0000 cdw11:0a190a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.688 [2024-10-04 08:27:04.335880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:11.688 [2024-10-04 08:27:04.335936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000a00 cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.688 [2024-10-04 08:27:04.335949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:11.688 #57 NEW cov: 11809 ft: 15001 corp: 40/1031b lim: 40 exec/s: 57 rss: 70Mb L: 25/40 MS: 1 CopyPart- 00:09:11.948 [2024-10-04 08:27:04.376058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.948 [2024-10-04 08:27:04.376085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.948 [2024-10-04 08:27:04.376141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:99ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.948 [2024-10-04 08:27:04.376155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:11.948 [2024-10-04 08:27:04.376219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffff3d cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.948 [2024-10-04 08:27:04.376234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:11.948 [2024-10-04 08:27:04.376288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ff40ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.948 [2024-10-04 08:27:04.376318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:11.948 #58 NEW cov: 11809 ft: 15004 corp: 41/1069b lim: 40 exec/s: 58 rss: 70Mb L: 38/40 MS: 1 ShuffleBytes- 00:09:11.948 [2024-10-04 08:27:04.416053] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a190a0a cdw11:190a0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.948 [2024-10-04 08:27:04.416079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.948 [2024-10-04 08:27:04.416136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:000a00b0 
cdw11:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.948 [2024-10-04 08:27:04.416150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:11.948 [2024-10-04 08:27:04.416208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.948 [2024-10-04 08:27:04.416222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:11.948 #59 NEW cov: 11809 ft: 15008 corp: 42/1099b lim: 40 exec/s: 59 rss: 70Mb L: 30/40 MS: 1 InsertRepeatedBytes- 00:09:11.948 [2024-10-04 08:27:04.446265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:f9f9f9f9 cdw11:f9f9f9f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.948 [2024-10-04 08:27:04.446291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.948 [2024-10-04 08:27:04.446349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:f9f9f9f9 cdw11:f9f9f90a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.948 [2024-10-04 08:27:04.446363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:11.948 [2024-10-04 08:27:04.446419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.948 [2024-10-04 08:27:04.446433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:11.948 [2024-10-04 08:27:04.446490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.948 [2024-10-04 08:27:04.446504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:11.948 #60 NEW cov: 11809 ft: 15018 corp: 43/1134b lim: 40 exec/s: 60 rss: 70Mb L: 35/40 MS: 1 InsertRepeatedBytes- 00:09:11.948 [2024-10-04 08:27:04.486009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:1a0a0100 cdw11:00260100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.948 [2024-10-04 08:27:04.486036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.948 [2024-10-04 08:27:04.526224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:1a0a0100 cdw11:00260100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.948 [2024-10-04 08:27:04.526254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.948 [2024-10-04 08:27:04.526313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:0000002b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:11.948 [2024-10-04 08:27:04.526328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:11.948 #62 NEW cov: 11809 ft: 15033 corp: 44/1150b lim: 40 exec/s: 31 
rss: 70Mb L: 16/40 MS: 2 ChangeBinInt-InsertByte- 00:09:11.948 #62 DONE cov: 11809 ft: 15033 corp: 44/1150b lim: 40 exec/s: 31 rss: 70Mb 00:09:11.948 ###### Recommended dictionary. ###### 00:09:11.948 "\031\000\000\000" # Uses: 4 00:09:11.948 "\001\000\000\037" # Uses: 0 00:09:11.948 "\001\000\000\000\000\000\000\000" # Uses: 0 00:09:11.948 "\000\000\000\000\000\000\003\377" # Uses: 1 00:09:11.948 ###### End of recommended dictionary. ###### 00:09:11.948 Done 62 runs in 2 second(s) 00:09:12.208 08:27:04 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_13.conf 00:09:12.208 08:27:04 -- ../common.sh@72 -- # (( i++ )) 00:09:12.208 08:27:04 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:12.208 08:27:04 -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:09:12.208 08:27:04 -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:09:12.208 08:27:04 -- nvmf/run.sh@24 -- # local timen=1 00:09:12.208 08:27:04 -- nvmf/run.sh@25 -- # local core=0x1 00:09:12.208 08:27:04 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:09:12.208 08:27:04 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:09:12.208 08:27:04 -- nvmf/run.sh@29 -- # printf %02d 14 00:09:12.208 08:27:04 -- nvmf/run.sh@29 -- # port=4414 00:09:12.208 08:27:04 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:09:12.208 08:27:04 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:09:12.208 08:27:04 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:12.208 08:27:04 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 -r /var/tmp/spdk14.sock 00:09:12.208 [2024-10-04 08:27:04.698940] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:09:12.208 [2024-10-04 08:27:04.699009] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1018646 ] 00:09:12.208 EAL: No free 2048 kB hugepages reported on node 1 00:09:12.208 [2024-10-04 08:27:04.876982] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:12.468 [2024-10-04 08:27:04.896965] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:12.468 [2024-10-04 08:27:04.897092] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:12.468 [2024-10-04 08:27:04.948629] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:12.468 [2024-10-04 08:27:04.964985] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:09:12.468 INFO: Running with entropic power schedule (0xFF, 100). 
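For reference when tracing the runs above and below: the start_llvm_fuzz invocations recorded in this log (local variables, port derivation via printf %02d, corpus directory creation, trsvcid rewrite, llvm_nvme_fuzz launch, config cleanup) repeat one pattern per fuzzer index. Below is a minimal bash sketch of that pattern reconstructed from the logged commands only; the SPDK_DIR shorthand and the redirection of the sed output into the per-run config file are assumptions not shown in the log.

#!/usr/bin/env bash
SPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk

start_llvm_fuzz() {
  # Arguments mirror the logged call "start_llvm_fuzz 14 1 0x1":
  # fuzzer index, time budget passed to -t, core mask passed to -m.
  local fuzzer_type=$1 timen=$2 core=$3
  local corpus_dir="$SPDK_DIR/../corpus/llvm_nvmf_$fuzzer_type"
  local nvmf_cfg="/tmp/fuzz_json_$fuzzer_type.conf"
  # Each fuzzer index N gets its own NVMe/TCP listener on port 44NN
  # (13 -> 4413, 14 -> 4414 in the runs above).
  local port="44$(printf %02d "$fuzzer_type")"
  mkdir -p "$corpus_dir"
  local trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
  # Rewrite the template config's default trsvcid 4420 to this run's port;
  # writing the result into $nvmf_cfg is an assumption (the log shows only the sed command).
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
    "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
  "$SPDK_DIR/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
    -m "$core" -s 512 -P "$SPDK_DIR/../output/llvm/" -F "$trid" \
    -c "$nvmf_cfg" -t "$timen" -D "$corpus_dir" -Z "$fuzzer_type" \
    -r "/var/tmp/spdk$fuzzer_type.sock"
  rm -rf "$nvmf_cfg"
}

start_llvm_fuzz 14 1 0x1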
00:09:12.468 INFO: Seed: 3095814351 00:09:12.468 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:09:12.468 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:09:12.468 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:09:12.468 INFO: A corpus is not provided, starting from an empty corpus 00:09:12.468 #2 INITED exec/s: 0 rss: 59Mb 00:09:12.468 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:12.468 This may also happen if the target rejected all inputs we tried so far 00:09:12.468 [2024-10-04 08:27:05.034437] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.468 [2024-10-04 08:27:05.034481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:12.727 NEW_FUNC[1/671]: 0x4654c8 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:09:12.727 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:12.727 #7 NEW cov: 11576 ft: 11577 corp: 2/11b lim: 35 exec/s: 0 rss: 67Mb L: 10/10 MS: 5 ShuffleBytes-ChangeByte-ShuffleBytes-ChangeByte-InsertRepeatedBytes- 00:09:12.727 [2024-10-04 08:27:05.365781] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.727 [2024-10-04 08:27:05.365848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:12.727 [2024-10-04 08:27:05.366002] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.727 [2024-10-04 08:27:05.366039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:12.727 #13 NEW cov: 11696 ft: 12818 corp: 3/25b lim: 35 exec/s: 0 rss: 67Mb L: 14/14 MS: 1 InsertRepeatedBytes- 00:09:12.987 [2024-10-04 08:27:05.415350] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.987 [2024-10-04 08:27:05.415387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:12.987 #14 NEW cov: 11702 ft: 13209 corp: 4/33b lim: 35 exec/s: 0 rss: 67Mb L: 8/14 MS: 1 EraseBytes- 00:09:12.987 [2024-10-04 08:27:05.455546] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000027 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.987 [2024-10-04 08:27:05.455574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:12.987 #15 NEW cov: 11787 ft: 13508 corp: 5/41b lim: 35 exec/s: 0 rss: 67Mb L: 8/14 MS: 1 ChangeBinInt- 00:09:12.987 [2024-10-04 08:27:05.495620] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.987 [2024-10-04 08:27:05.495653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:09:12.987 #16 NEW cov: 11787 ft: 13562 corp: 6/51b lim: 35 exec/s: 0 rss: 67Mb L: 10/14 MS: 1 EraseBytes- 00:09:12.987 [2024-10-04 08:27:05.535713] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000027 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.987 [2024-10-04 08:27:05.535741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:12.987 #17 NEW cov: 11787 ft: 13693 corp: 7/59b lim: 35 exec/s: 0 rss: 67Mb L: 8/14 MS: 1 ChangeBit- 00:09:12.987 [2024-10-04 08:27:05.576129] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000027 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.987 [2024-10-04 08:27:05.576157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:12.987 [2024-10-04 08:27:05.576287] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:000000f5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.987 [2024-10-04 08:27:05.576305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:12.987 #18 NEW cov: 11787 ft: 13771 corp: 8/76b lim: 35 exec/s: 0 rss: 67Mb L: 17/17 MS: 1 CrossOver- 00:09:12.987 [2024-10-04 08:27:05.616192] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.987 [2024-10-04 08:27:05.616230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:12.987 [2024-10-04 08:27:05.616360] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.987 [2024-10-04 08:27:05.616378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:12.987 #19 NEW cov: 11787 ft: 13870 corp: 9/96b lim: 35 exec/s: 0 rss: 67Mb L: 20/20 MS: 1 CrossOver- 00:09:12.987 [2024-10-04 08:27:05.666149] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.987 [2024-10-04 08:27:05.666182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:13.247 #20 NEW cov: 11787 ft: 13903 corp: 10/105b lim: 35 exec/s: 0 rss: 67Mb L: 9/20 MS: 1 InsertByte- 00:09:13.247 [2024-10-04 08:27:05.706208] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.247 [2024-10-04 08:27:05.706244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:13.247 #21 NEW cov: 11787 ft: 13951 corp: 11/114b lim: 35 exec/s: 0 rss: 67Mb L: 9/20 MS: 1 ShuffleBytes- 00:09:13.247 [2024-10-04 08:27:05.746381] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000007a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.247 [2024-10-04 08:27:05.746410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:13.247 #22 NEW cov: 11787 ft: 13987 corp: 12/123b lim: 35 exec/s: 0 rss: 67Mb L: 9/20 MS: 
1 InsertByte- 00:09:13.247 [2024-10-04 08:27:05.786510] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.247 [2024-10-04 08:27:05.786540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:13.247 #23 NEW cov: 11787 ft: 14017 corp: 13/136b lim: 35 exec/s: 0 rss: 68Mb L: 13/20 MS: 1 CMP- DE: "\377\377\377\377"- 00:09:13.247 [2024-10-04 08:27:05.826804] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.247 [2024-10-04 08:27:05.826832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:13.247 [2024-10-04 08:27:05.826979] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.247 [2024-10-04 08:27:05.826997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:13.247 #24 NEW cov: 11787 ft: 14104 corp: 14/155b lim: 35 exec/s: 0 rss: 68Mb L: 19/20 MS: 1 EraseBytes- 00:09:13.247 [2024-10-04 08:27:05.876689] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.247 [2024-10-04 08:27:05.876721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:13.247 #25 NEW cov: 11787 ft: 14116 corp: 15/166b lim: 35 exec/s: 0 rss: 68Mb L: 11/20 MS: 1 CopyPart- 00:09:13.247 [2024-10-04 08:27:05.916753] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.247 [2024-10-04 08:27:05.916785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:13.506 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:13.506 #26 NEW cov: 11810 ft: 14151 corp: 16/175b lim: 35 exec/s: 0 rss: 68Mb L: 9/20 MS: 1 ChangeByte- 00:09:13.506 [2024-10-04 08:27:05.957731] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000027 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.506 [2024-10-04 08:27:05.957768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:13.507 [2024-10-04 08:27:05.957917] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.507 [2024-10-04 08:27:05.957940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:13.507 [2024-10-04 08:27:05.958069] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.507 [2024-10-04 08:27:05.958088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:13.507 [2024-10-04 08:27:05.958220] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 
cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.507 [2024-10-04 08:27:05.958242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:13.507 #29 NEW cov: 11810 ft: 14462 corp: 17/207b lim: 35 exec/s: 0 rss: 68Mb L: 32/32 MS: 3 CrossOver-ShuffleBytes-InsertRepeatedBytes- 00:09:13.507 [2024-10-04 08:27:06.007356] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000027 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.507 [2024-10-04 08:27:06.007384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:13.507 [2024-10-04 08:27:06.007523] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:000000f5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.507 [2024-10-04 08:27:06.007539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:13.507 #30 NEW cov: 11810 ft: 14500 corp: 18/222b lim: 35 exec/s: 30 rss: 68Mb L: 15/32 MS: 1 EraseBytes- 00:09:13.507 [2024-10-04 08:27:06.047218] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.507 [2024-10-04 08:27:06.047251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:13.507 #31 NEW cov: 11810 ft: 14579 corp: 19/233b lim: 35 exec/s: 31 rss: 68Mb L: 11/32 MS: 1 CopyPart- 00:09:13.507 [2024-10-04 08:27:06.087341] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000027 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.507 [2024-10-04 08:27:06.087369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:13.507 #32 NEW cov: 11810 ft: 14585 corp: 20/241b lim: 35 exec/s: 32 rss: 68Mb L: 8/32 MS: 1 ShuffleBytes- 00:09:13.507 [2024-10-04 08:27:06.127440] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000027 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.507 [2024-10-04 08:27:06.127471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:13.507 #33 NEW cov: 11810 ft: 14596 corp: 21/253b lim: 35 exec/s: 33 rss: 68Mb L: 12/32 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:09:13.507 [2024-10-04 08:27:06.167813] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.507 [2024-10-04 08:27:06.167846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:13.507 [2024-10-04 08:27:06.167991] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.507 [2024-10-04 08:27:06.168014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:13.766 #34 NEW cov: 11810 ft: 14619 corp: 22/267b lim: 35 exec/s: 34 rss: 68Mb L: 14/32 MS: 1 ShuffleBytes- 00:09:13.766 [2024-10-04 08:27:06.207853] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d9 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.766 [2024-10-04 08:27:06.207887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:13.766 [2024-10-04 08:27:06.208014] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000009c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.766 [2024-10-04 08:27:06.208035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:13.766 #35 NEW cov: 11810 ft: 14640 corp: 23/283b lim: 35 exec/s: 35 rss: 68Mb L: 16/32 MS: 1 InsertRepeatedBytes- 00:09:13.766 [2024-10-04 08:27:06.248533] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000027 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.766 [2024-10-04 08:27:06.248566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:13.766 [2024-10-04 08:27:06.248724] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000006e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.766 [2024-10-04 08:27:06.248742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:13.766 [2024-10-04 08:27:06.248875] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000006e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.766 [2024-10-04 08:27:06.248893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:13.766 [2024-10-04 08:27:06.249026] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.766 [2024-10-04 08:27:06.249043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:13.766 #36 NEW cov: 11810 ft: 14655 corp: 24/312b lim: 35 exec/s: 36 rss: 68Mb L: 29/32 MS: 1 InsertRepeatedBytes- 00:09:13.766 [2024-10-04 08:27:06.298203] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000027 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.766 [2024-10-04 08:27:06.298233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:13.766 [2024-10-04 08:27:06.298378] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000026 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.766 [2024-10-04 08:27:06.298394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:13.766 #37 NEW cov: 11810 ft: 14667 corp: 25/329b lim: 35 exec/s: 37 rss: 68Mb L: 17/32 MS: 1 CMP- DE: "\036\000"- 00:09:13.766 [2024-10-04 08:27:06.338332] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.766 [2024-10-04 08:27:06.338363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:13.766 [2024-10-04 08:27:06.338510] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:09:13.766 [2024-10-04 08:27:06.338533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:13.766 #38 NEW cov: 11810 ft: 14735 corp: 26/343b lim: 35 exec/s: 38 rss: 68Mb L: 14/32 MS: 1 ChangeBinInt- 00:09:13.766 [2024-10-04 08:27:06.378653] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.766 [2024-10-04 08:27:06.378681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:13.766 [2024-10-04 08:27:06.378822] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.766 [2024-10-04 08:27:06.378848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:13.767 [2024-10-04 08:27:06.378993] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.767 [2024-10-04 08:27:06.379017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:13.767 #40 NEW cov: 11810 ft: 14971 corp: 27/364b lim: 35 exec/s: 40 rss: 68Mb L: 21/32 MS: 2 EraseBytes-InsertRepeatedBytes- 00:09:13.767 [2024-10-04 08:27:06.418344] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.767 [2024-10-04 08:27:06.418372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:13.767 #41 NEW cov: 11810 ft: 14981 corp: 28/377b lim: 35 exec/s: 41 rss: 69Mb L: 13/32 MS: 1 ChangeByte- 00:09:14.026 [2024-10-04 08:27:06.468470] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.027 [2024-10-04 08:27:06.468503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.027 #42 NEW cov: 11810 ft: 15022 corp: 29/390b lim: 35 exec/s: 42 rss: 69Mb L: 13/32 MS: 1 CMP- DE: "\000\000\000\002"- 00:09:14.027 [2024-10-04 08:27:06.508512] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.027 [2024-10-04 08:27:06.508540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.027 #43 NEW cov: 11810 ft: 15031 corp: 30/399b lim: 35 exec/s: 43 rss: 69Mb L: 9/32 MS: 1 EraseBytes- 00:09:14.027 [2024-10-04 08:27:06.549489] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000027 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.027 [2024-10-04 08:27:06.549522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.027 [2024-10-04 08:27:06.549654] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000006e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.027 [2024-10-04 08:27:06.549672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:09:14.027 [2024-10-04 08:27:06.549816] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000006e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.027 [2024-10-04 08:27:06.549834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:14.027 [2024-10-04 08:27:06.549963] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000026 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.027 [2024-10-04 08:27:06.549980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:14.027 #44 NEW cov: 11810 ft: 15046 corp: 31/427b lim: 35 exec/s: 44 rss: 69Mb L: 28/32 MS: 1 EraseBytes- 00:09:14.027 [2024-10-04 08:27:06.598794] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000007a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.027 [2024-10-04 08:27:06.598823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.027 #45 NEW cov: 11810 ft: 15068 corp: 32/436b lim: 35 exec/s: 45 rss: 69Mb L: 9/32 MS: 1 ChangeBinInt- 00:09:14.027 [2024-10-04 08:27:06.639784] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.027 [2024-10-04 08:27:06.639817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.027 [2024-10-04 08:27:06.639953] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000009c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.027 [2024-10-04 08:27:06.639974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:14.027 [2024-10-04 08:27:06.640094] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:000000d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.027 [2024-10-04 08:27:06.640110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:14.027 [2024-10-04 08:27:06.640233] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:8000009c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.027 [2024-10-04 08:27:06.640255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:14.027 #46 NEW cov: 11810 ft: 15087 corp: 33/465b lim: 35 exec/s: 46 rss: 69Mb L: 29/32 MS: 1 CopyPart- 00:09:14.027 [2024-10-04 08:27:06.689338] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.027 [2024-10-04 08:27:06.689371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.027 [2024-10-04 08:27:06.689490] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:000000f5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.027 [2024-10-04 08:27:06.689505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:14.286 #47 NEW cov: 11810 ft: 15107 corp: 
34/480b lim: 35 exec/s: 47 rss: 69Mb L: 15/32 MS: 1 ChangeBinInt- 00:09:14.286 [2024-10-04 08:27:06.729287] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.286 [2024-10-04 08:27:06.729320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.286 #48 NEW cov: 11810 ft: 15114 corp: 35/493b lim: 35 exec/s: 48 rss: 69Mb L: 13/32 MS: 1 ChangeBit- 00:09:14.286 [2024-10-04 08:27:06.779793] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.286 [2024-10-04 08:27:06.779824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.287 [2024-10-04 08:27:06.779967] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.287 [2024-10-04 08:27:06.779989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:14.287 #49 NEW cov: 11810 ft: 15146 corp: 36/507b lim: 35 exec/s: 49 rss: 69Mb L: 14/32 MS: 1 ShuffleBytes- 00:09:14.287 [2024-10-04 08:27:06.819572] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000007a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.287 [2024-10-04 08:27:06.819600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.287 #50 NEW cov: 11810 ft: 15148 corp: 37/517b lim: 35 exec/s: 50 rss: 69Mb L: 10/32 MS: 1 CrossOver- 00:09:14.287 [2024-10-04 08:27:06.859881] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.287 [2024-10-04 08:27:06.859911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.287 [2024-10-04 08:27:06.860028] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.287 [2024-10-04 08:27:06.860047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:14.287 #51 NEW cov: 11810 ft: 15157 corp: 38/531b lim: 35 exec/s: 51 rss: 69Mb L: 14/32 MS: 1 ShuffleBytes- 00:09:14.287 [2024-10-04 08:27:06.910550] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.287 [2024-10-04 08:27:06.910586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.287 [2024-10-04 08:27:06.910707] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000009c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.287 [2024-10-04 08:27:06.910727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:14.287 [2024-10-04 08:27:06.910858] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.287 [2024-10-04 
08:27:06.910881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:14.287 [2024-10-04 08:27:06.911013] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:8000009c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.287 [2024-10-04 08:27:06.911033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:14.287 #52 NEW cov: 11810 ft: 15167 corp: 39/561b lim: 35 exec/s: 52 rss: 69Mb L: 30/32 MS: 1 InsertByte- 00:09:14.287 [2024-10-04 08:27:06.959891] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.287 [2024-10-04 08:27:06.959924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.546 #53 NEW cov: 11810 ft: 15185 corp: 40/569b lim: 35 exec/s: 53 rss: 69Mb L: 8/32 MS: 1 EraseBytes- 00:09:14.546 [2024-10-04 08:27:07.000850] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.546 [2024-10-04 08:27:07.000880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.546 [2024-10-04 08:27:07.001014] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000009c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.546 [2024-10-04 08:27:07.001037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:14.546 [2024-10-04 08:27:07.001172] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:000000d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.546 [2024-10-04 08:27:07.001193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:14.546 [2024-10-04 08:27:07.001320] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:8000009c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.546 [2024-10-04 08:27:07.001341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:14.546 #54 NEW cov: 11810 ft: 15196 corp: 41/599b lim: 35 exec/s: 27 rss: 69Mb L: 30/32 MS: 1 InsertByte- 00:09:14.546 #54 DONE cov: 11810 ft: 15196 corp: 41/599b lim: 35 exec/s: 27 rss: 69Mb 00:09:14.546 ###### Recommended dictionary. ###### 00:09:14.546 "\377\377\377\377" # Uses: 1 00:09:14.546 "\036\000" # Uses: 0 00:09:14.546 "\000\000\000\002" # Uses: 0 00:09:14.546 ###### End of recommended dictionary. 
###### 00:09:14.546 Done 54 runs in 2 second(s) 00:09:14.546 08:27:07 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_14.conf 00:09:14.546 08:27:07 -- ../common.sh@72 -- # (( i++ )) 00:09:14.546 08:27:07 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:14.546 08:27:07 -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:09:14.546 08:27:07 -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:09:14.546 08:27:07 -- nvmf/run.sh@24 -- # local timen=1 00:09:14.546 08:27:07 -- nvmf/run.sh@25 -- # local core=0x1 00:09:14.546 08:27:07 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:09:14.546 08:27:07 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:09:14.546 08:27:07 -- nvmf/run.sh@29 -- # printf %02d 15 00:09:14.546 08:27:07 -- nvmf/run.sh@29 -- # port=4415 00:09:14.546 08:27:07 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:09:14.546 08:27:07 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:09:14.546 08:27:07 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:14.546 08:27:07 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 -r /var/tmp/spdk15.sock 00:09:14.546 [2024-10-04 08:27:07.175120] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:09:14.546 [2024-10-04 08:27:07.175197] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1019353 ] 00:09:14.546 EAL: No free 2048 kB hugepages reported on node 1 00:09:14.806 [2024-10-04 08:27:07.350277] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:14.806 [2024-10-04 08:27:07.369072] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:14.806 [2024-10-04 08:27:07.369197] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:14.806 [2024-10-04 08:27:07.420403] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:14.806 [2024-10-04 08:27:07.436726] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:09:14.806 INFO: Running with entropic power schedule (0xFF, 100). 00:09:14.806 INFO: Seed: 1274844797 00:09:14.806 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:09:14.806 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:09:14.806 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:09:14.806 INFO: A corpus is not provided, starting from an empty corpus 00:09:14.806 #2 INITED exec/s: 0 rss: 59Mb 00:09:14.806 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
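For reference, the start_llvm_fuzz trace above (nvmf/run.sh lines 23-46) amounts to the following standalone script. This is a minimal sketch reconstructed from this one trace only, not the canonical nvmf/run.sh: the "44 + zero-padded fuzzer type" port derivation and the redirection of sed's output into $nvmf_cfg are inferred assumptions rather than lines confirmed by the log; every flag passed to llvm_nvme_fuzz is taken directly from the traced command.

#!/usr/bin/env bash
# Minimal sketch of one start_llvm_fuzz iteration, reconstructed from the
# trace above. The port derivation and the sed redirect are assumptions.
rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk

start_llvm_fuzz() {
    local fuzzer_type=$1 timen=$2 core=$3
    local corpus_dir=$rootdir/../corpus/llvm_nvmf_${fuzzer_type}
    local nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf
    # Assumption: each fuzzer instance listens on port 44 + zero-padded
    # fuzzer type (15 -> 4415), matching "printf %02d 15" / "port=4415" above.
    local port="44$(printf %02d "$fuzzer_type")"
    local trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"

    mkdir -p "$corpus_dir"
    # Rewrite the template target config so its TCP listener matches this
    # instance's port (assumption: the traced sed writes into $nvmf_cfg).
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"

    # Flags as traced: -m core mask, -s hugepage memory (MB), -P output dir,
    # -F transport ID, -c target config, -t run time, -D corpus dir,
    # -Z fuzz target number, -r per-instance RPC socket.
    "$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
        -m "$core" -s 512 \
        -P "$rootdir/../output/llvm/" \
        -F "$trid" \
        -c "$nvmf_cfg" -t "$timen" \
        -D "$corpus_dir" \
        -Z "$fuzzer_type" \
        -r "/var/tmp/spdk${fuzzer_type}.sock"

    rm -rf "$nvmf_cfg"    # mirrors the rm of /tmp/fuzz_json_14.conf after run 14
}

start_llvm_fuzz 15 1 0x1    # fuzz target 15, 1-second run, core mask 0x1

Called as start_llvm_fuzz 15 1 0x1, the sketch reproduces the llvm_nvme_fuzz command line logged above: target 15 fuzzing the NVMe/TCP listener on 127.0.0.1 port 4415, which is exactly what the "NVMe/TCP Target Listening" notice in this run reports.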
00:09:14.806 This may also happen if the target rejected all inputs we tried so far 00:09:14.806 [2024-10-04 08:27:07.482301] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.806 [2024-10-04 08:27:07.482332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.806 [2024-10-04 08:27:07.482394] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.806 [2024-10-04 08:27:07.482409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:14.806 [2024-10-04 08:27:07.482468] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.806 [2024-10-04 08:27:07.482482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:14.806 [2024-10-04 08:27:07.482535] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:14.806 [2024-10-04 08:27:07.482549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:15.323 NEW_FUNC[1/670]: 0x466a08 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:09:15.323 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:15.324 #9 NEW cov: 11564 ft: 11564 corp: 2/33b lim: 35 exec/s: 0 rss: 67Mb L: 32/32 MS: 2 CMP-InsertRepeatedBytes- DE: "\377?"- 00:09:15.324 [2024-10-04 08:27:07.773039] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.324 [2024-10-04 08:27:07.773072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:15.324 [2024-10-04 08:27:07.773132] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.324 [2024-10-04 08:27:07.773147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:15.324 [2024-10-04 08:27:07.773208] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.324 [2024-10-04 08:27:07.773222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:15.324 [2024-10-04 08:27:07.773278] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.324 [2024-10-04 08:27:07.773291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:15.324 #10 NEW cov: 11677 ft: 12009 corp: 3/65b lim: 35 exec/s: 0 rss: 67Mb L: 32/32 MS: 1 ChangeBinInt- 00:09:15.324 [2024-10-04 08:27:07.823078] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES 
RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.324 [2024-10-04 08:27:07.823105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:15.324 [2024-10-04 08:27:07.823164] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.324 [2024-10-04 08:27:07.823179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:15.324 [2024-10-04 08:27:07.823243] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.324 [2024-10-04 08:27:07.823257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:15.324 [2024-10-04 08:27:07.823318] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.324 [2024-10-04 08:27:07.823331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:15.324 #11 NEW cov: 11683 ft: 12176 corp: 4/97b lim: 35 exec/s: 0 rss: 67Mb L: 32/32 MS: 1 CrossOver- 00:09:15.324 [2024-10-04 08:27:07.863339] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.324 [2024-10-04 08:27:07.863365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:15.324 [2024-10-04 08:27:07.863422] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.324 [2024-10-04 08:27:07.863436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:15.324 [2024-10-04 08:27:07.863494] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.324 [2024-10-04 08:27:07.863507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:15.324 [2024-10-04 08:27:07.863566] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.324 [2024-10-04 08:27:07.863584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:15.324 [2024-10-04 08:27:07.863641] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.324 [2024-10-04 08:27:07.863655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:15.324 #12 NEW cov: 11768 ft: 12454 corp: 5/132b lim: 35 exec/s: 0 rss: 67Mb L: 35/35 MS: 1 CopyPart- 00:09:15.324 [2024-10-04 08:27:07.903060] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.324 [2024-10-04 08:27:07.903086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:09:15.324 [2024-10-04 08:27:07.903144] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.324 [2024-10-04 08:27:07.903158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:15.324 #13 NEW cov: 11768 ft: 13079 corp: 6/150b lim: 35 exec/s: 0 rss: 67Mb L: 18/35 MS: 1 EraseBytes- 00:09:15.324 [2024-10-04 08:27:07.943581] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.324 [2024-10-04 08:27:07.943608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:15.324 [2024-10-04 08:27:07.943667] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.324 [2024-10-04 08:27:07.943682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:15.324 [2024-10-04 08:27:07.943739] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.324 [2024-10-04 08:27:07.943753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:15.324 [2024-10-04 08:27:07.943813] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.324 [2024-10-04 08:27:07.943827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:15.324 [2024-10-04 08:27:07.943886] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.324 [2024-10-04 08:27:07.943900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:15.324 #14 NEW cov: 11768 ft: 13176 corp: 7/185b lim: 35 exec/s: 0 rss: 67Mb L: 35/35 MS: 1 ChangeBit- 00:09:15.324 [2024-10-04 08:27:07.983563] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.324 [2024-10-04 08:27:07.983590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:15.324 [2024-10-04 08:27:07.983655] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.324 [2024-10-04 08:27:07.983671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:15.324 [2024-10-04 08:27:07.983733] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.324 [2024-10-04 08:27:07.983747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:15.324 [2024-10-04 08:27:07.983810] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:09:15.324 [2024-10-04 08:27:07.983827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:15.324 #15 NEW cov: 11768 ft: 13349 corp: 8/217b lim: 35 exec/s: 0 rss: 67Mb L: 32/35 MS: 1 ChangeBit- 00:09:15.582 [2024-10-04 08:27:08.023672] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.582 [2024-10-04 08:27:08.023698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:15.582 [2024-10-04 08:27:08.023759] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.582 [2024-10-04 08:27:08.023773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:15.582 [2024-10-04 08:27:08.023830] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.582 [2024-10-04 08:27:08.023844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:15.582 [2024-10-04 08:27:08.023901] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.582 [2024-10-04 08:27:08.023915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:15.582 #16 NEW cov: 11768 ft: 13425 corp: 9/249b lim: 35 exec/s: 0 rss: 67Mb L: 32/35 MS: 1 CopyPart- 00:09:15.582 [2024-10-04 08:27:08.063482] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.582 [2024-10-04 08:27:08.063509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:15.582 [2024-10-04 08:27:08.063569] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.582 [2024-10-04 08:27:08.063584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:15.582 #17 NEW cov: 11768 ft: 13497 corp: 10/268b lim: 35 exec/s: 0 rss: 67Mb L: 19/35 MS: 1 InsertByte- 00:09:15.582 [2024-10-04 08:27:08.103867] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.582 [2024-10-04 08:27:08.103893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:15.582 [2024-10-04 08:27:08.103954] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.582 [2024-10-04 08:27:08.103968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:15.582 [2024-10-04 08:27:08.104027] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.582 [2024-10-04 08:27:08.104041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:15.582 [2024-10-04 08:27:08.104098] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.582 [2024-10-04 08:27:08.104112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:15.582 #18 NEW cov: 11768 ft: 13528 corp: 11/301b lim: 35 exec/s: 0 rss: 67Mb L: 33/35 MS: 1 InsertByte- 00:09:15.582 [2024-10-04 08:27:08.143997] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.582 [2024-10-04 08:27:08.144024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:15.582 [2024-10-04 08:27:08.144086] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.582 [2024-10-04 08:27:08.144099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:15.582 [2024-10-04 08:27:08.144156] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.582 [2024-10-04 08:27:08.144170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:15.582 [2024-10-04 08:27:08.144229] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.582 [2024-10-04 08:27:08.144243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:15.582 #19 NEW cov: 11768 ft: 13645 corp: 12/334b lim: 35 exec/s: 0 rss: 68Mb L: 33/35 MS: 1 ChangeByte- 00:09:15.582 [2024-10-04 08:27:08.183857] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.582 [2024-10-04 08:27:08.183883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:15.582 [2024-10-04 08:27:08.183940] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000003f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.582 [2024-10-04 08:27:08.183954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:15.582 #20 NEW cov: 11768 ft: 13754 corp: 13/350b lim: 35 exec/s: 0 rss: 68Mb L: 16/35 MS: 1 CrossOver- 00:09:15.582 [2024-10-04 08:27:08.223994] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.582 [2024-10-04 08:27:08.224022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:15.582 [2024-10-04 08:27:08.224081] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000003f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.582 [2024-10-04 08:27:08.224096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:15.582 #21 NEW cov: 11768 
ft: 13786 corp: 14/367b lim: 35 exec/s: 0 rss: 68Mb L: 17/35 MS: 1 InsertByte- 00:09:15.842 [2024-10-04 08:27:08.264376] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.842 [2024-10-04 08:27:08.264403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:15.842 [2024-10-04 08:27:08.264467] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.842 [2024-10-04 08:27:08.264482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:15.842 [2024-10-04 08:27:08.264545] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.842 [2024-10-04 08:27:08.264559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:15.842 [2024-10-04 08:27:08.264620] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.842 [2024-10-04 08:27:08.264635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:15.842 #22 NEW cov: 11768 ft: 13795 corp: 15/399b lim: 35 exec/s: 0 rss: 68Mb L: 32/35 MS: 1 ChangeBinInt- 00:09:15.842 [2024-10-04 08:27:08.304213] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.842 [2024-10-04 08:27:08.304251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:15.842 [2024-10-04 08:27:08.304315] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.842 [2024-10-04 08:27:08.304329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:15.842 #23 NEW cov: 11768 ft: 13804 corp: 16/417b lim: 35 exec/s: 0 rss: 68Mb L: 18/35 MS: 1 ChangeBinInt- 00:09:15.842 [2024-10-04 08:27:08.344594] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.842 [2024-10-04 08:27:08.344620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:15.842 [2024-10-04 08:27:08.344684] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.842 [2024-10-04 08:27:08.344698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:15.842 [2024-10-04 08:27:08.344759] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.842 [2024-10-04 08:27:08.344773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:15.842 [2024-10-04 08:27:08.344837] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED 
cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.842 [2024-10-04 08:27:08.344851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:15.842 #24 NEW cov: 11768 ft: 13833 corp: 17/449b lim: 35 exec/s: 0 rss: 68Mb L: 32/35 MS: 1 ChangeByte- 00:09:15.842 [2024-10-04 08:27:08.384874] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.842 [2024-10-04 08:27:08.384901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:15.842 [2024-10-04 08:27:08.384964] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.842 [2024-10-04 08:27:08.384977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:15.842 [2024-10-04 08:27:08.385038] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.842 [2024-10-04 08:27:08.385052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:15.842 [2024-10-04 08:27:08.385110] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.842 [2024-10-04 08:27:08.385124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:15.842 [2024-10-04 08:27:08.385182] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.842 [2024-10-04 08:27:08.385201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:15.842 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:15.842 #25 NEW cov: 11791 ft: 13877 corp: 18/484b lim: 35 exec/s: 0 rss: 68Mb L: 35/35 MS: 1 ChangeByte- 00:09:15.842 [2024-10-04 08:27:08.424901] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.842 [2024-10-04 08:27:08.424928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:15.842 [2024-10-04 08:27:08.424989] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.842 [2024-10-04 08:27:08.425003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:15.842 [2024-10-04 08:27:08.425065] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.842 [2024-10-04 08:27:08.425079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:15.842 [2024-10-04 08:27:08.425140] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.842 
[2024-10-04 08:27:08.425153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:15.842 [2024-10-04 08:27:08.425214] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.842 [2024-10-04 08:27:08.425228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:15.842 #26 NEW cov: 11791 ft: 13954 corp: 19/519b lim: 35 exec/s: 0 rss: 68Mb L: 35/35 MS: 1 PersAutoDict- DE: "\377?"- 00:09:15.842 [2024-10-04 08:27:08.464662] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.842 [2024-10-04 08:27:08.464689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:15.843 [2024-10-04 08:27:08.464749] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.843 [2024-10-04 08:27:08.464764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:15.843 #27 NEW cov: 11791 ft: 14028 corp: 20/537b lim: 35 exec/s: 27 rss: 68Mb L: 18/35 MS: 1 ShuffleBytes- 00:09:15.843 [2024-10-04 08:27:08.505204] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.843 [2024-10-04 08:27:08.505231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:15.843 [2024-10-04 08:27:08.505291] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.843 [2024-10-04 08:27:08.505306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:15.843 [2024-10-04 08:27:08.505368] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.843 [2024-10-04 08:27:08.505383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:15.843 [2024-10-04 08:27:08.505444] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.843 [2024-10-04 08:27:08.505458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:15.843 [2024-10-04 08:27:08.505515] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:15.843 [2024-10-04 08:27:08.505530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:16.102 #28 NEW cov: 11791 ft: 14033 corp: 21/572b lim: 35 exec/s: 28 rss: 68Mb L: 35/35 MS: 1 CopyPart- 00:09:16.102 [2024-10-04 08:27:08.545178] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.102 [2024-10-04 08:27:08.545211] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:16.102 [2024-10-04 08:27:08.545273] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.102 [2024-10-04 08:27:08.545287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:16.102 [2024-10-04 08:27:08.545352] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.102 [2024-10-04 08:27:08.545365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:16.102 [2024-10-04 08:27:08.545430] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.102 [2024-10-04 08:27:08.545444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:16.102 #29 NEW cov: 11791 ft: 14051 corp: 22/604b lim: 35 exec/s: 29 rss: 68Mb L: 32/35 MS: 1 ChangeBit- 00:09:16.102 [2024-10-04 08:27:08.585266] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.102 [2024-10-04 08:27:08.585291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:16.102 [2024-10-04 08:27:08.585350] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.102 [2024-10-04 08:27:08.585364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:16.102 [2024-10-04 08:27:08.585424] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.103 [2024-10-04 08:27:08.585438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:16.103 [2024-10-04 08:27:08.585500] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.103 [2024-10-04 08:27:08.585513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:16.103 #30 NEW cov: 11791 ft: 14056 corp: 23/638b lim: 35 exec/s: 30 rss: 68Mb L: 34/35 MS: 1 InsertByte- 00:09:16.103 [2024-10-04 08:27:08.625424] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.103 [2024-10-04 08:27:08.625450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:16.103 [2024-10-04 08:27:08.625512] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.103 [2024-10-04 08:27:08.625526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:16.103 [2024-10-04 08:27:08.625641] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 
cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.103 [2024-10-04 08:27:08.625655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:16.103 NEW_FUNC[1/1]: 0x4843a8 in feat_number_of_queues /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:318 00:09:16.103 #31 NEW cov: 11823 ft: 14243 corp: 24/670b lim: 35 exec/s: 31 rss: 68Mb L: 32/35 MS: 1 ShuffleBytes- 00:09:16.103 [2024-10-04 08:27:08.665505] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.103 [2024-10-04 08:27:08.665530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:16.103 [2024-10-04 08:27:08.665596] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.103 [2024-10-04 08:27:08.665610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:16.103 [2024-10-04 08:27:08.665671] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.103 [2024-10-04 08:27:08.665685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:16.103 [2024-10-04 08:27:08.665743] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.103 [2024-10-04 08:27:08.665757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:16.103 #32 NEW cov: 11823 ft: 14260 corp: 25/704b lim: 35 exec/s: 32 rss: 69Mb L: 34/35 MS: 1 ShuffleBytes- 00:09:16.103 [2024-10-04 08:27:08.705473] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.103 [2024-10-04 08:27:08.705499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:16.103 [2024-10-04 08:27:08.705560] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.103 [2024-10-04 08:27:08.705574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:16.103 [2024-10-04 08:27:08.705635] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.103 [2024-10-04 08:27:08.705649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:16.103 [2024-10-04 08:27:08.705707] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.103 [2024-10-04 08:27:08.705721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:16.103 #33 NEW cov: 11823 ft: 14270 corp: 26/736b lim: 35 exec/s: 33 rss: 69Mb L: 32/35 MS: 1 CopyPart- 00:09:16.103 [2024-10-04 08:27:08.745897] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.103 [2024-10-04 08:27:08.745923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:16.103 [2024-10-04 08:27:08.745983] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.103 [2024-10-04 08:27:08.745997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:16.103 [2024-10-04 08:27:08.746056] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.103 [2024-10-04 08:27:08.746070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:16.103 [2024-10-04 08:27:08.746128] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.103 [2024-10-04 08:27:08.746141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:16.103 [2024-10-04 08:27:08.746203] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.103 [2024-10-04 08:27:08.746216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:16.103 #34 NEW cov: 11823 ft: 14321 corp: 27/771b lim: 35 exec/s: 34 rss: 69Mb L: 35/35 MS: 1 PersAutoDict- DE: "\377?"- 00:09:16.363 [2024-10-04 08:27:08.786025] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.363 [2024-10-04 08:27:08.786051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:16.363 [2024-10-04 08:27:08.786114] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.363 [2024-10-04 08:27:08.786129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:16.363 [2024-10-04 08:27:08.786190] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.363 [2024-10-04 08:27:08.786205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:16.363 [2024-10-04 08:27:08.786266] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.363 [2024-10-04 08:27:08.786279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:16.363 [2024-10-04 08:27:08.786339] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.363 [2024-10-04 08:27:08.786354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:16.363 
#35 NEW cov: 11823 ft: 14329 corp: 28/806b lim: 35 exec/s: 35 rss: 69Mb L: 35/35 MS: 1 CMP- DE: "\000\000\000\021"- 00:09:16.363 [2024-10-04 08:27:08.825702] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.363 [2024-10-04 08:27:08.825729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:16.363 NEW_FUNC[1/1]: 0x481108 in feat_power_management /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:282 00:09:16.363 #36 NEW cov: 11846 ft: 14554 corp: 29/825b lim: 35 exec/s: 36 rss: 69Mb L: 19/35 MS: 1 ChangeBit- 00:09:16.363 [2024-10-04 08:27:08.865805] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.363 [2024-10-04 08:27:08.865831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:16.363 [2024-10-04 08:27:08.865888] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.363 [2024-10-04 08:27:08.865903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:16.363 #37 NEW cov: 11846 ft: 14564 corp: 30/844b lim: 35 exec/s: 37 rss: 69Mb L: 19/35 MS: 1 ChangeByte- 00:09:16.363 [2024-10-04 08:27:08.906325] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.363 [2024-10-04 08:27:08.906351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:16.363 [2024-10-04 08:27:08.906410] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.363 [2024-10-04 08:27:08.906424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:16.363 [2024-10-04 08:27:08.906484] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.363 [2024-10-04 08:27:08.906498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:16.363 [2024-10-04 08:27:08.906561] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.363 [2024-10-04 08:27:08.906575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:16.363 [2024-10-04 08:27:08.906634] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.363 [2024-10-04 08:27:08.906648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:16.363 #38 NEW cov: 11846 ft: 14567 corp: 31/879b lim: 35 exec/s: 38 rss: 69Mb L: 35/35 MS: 1 CrossOver- 00:09:16.363 NEW_FUNC[1/1]: 0x4868f8 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 
00:09:16.363 #39 NEW cov: 11860 ft: 14840 corp: 32/886b lim: 35 exec/s: 39 rss: 69Mb L: 7/35 MS: 1 InsertRepeatedBytes- 00:09:16.363 [2024-10-04 08:27:08.986536] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.363 [2024-10-04 08:27:08.986563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:16.363 [2024-10-04 08:27:08.986620] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000723 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.363 [2024-10-04 08:27:08.986634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:16.363 [2024-10-04 08:27:08.986692] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.363 [2024-10-04 08:27:08.986706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:16.363 [2024-10-04 08:27:08.986762] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.363 [2024-10-04 08:27:08.986777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:16.363 [2024-10-04 08:27:08.986831] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.363 [2024-10-04 08:27:08.986845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:16.363 #40 NEW cov: 11860 ft: 14854 corp: 33/921b lim: 35 exec/s: 40 rss: 69Mb L: 35/35 MS: 1 ChangeBinInt- 00:09:16.623 #41 NEW cov: 11860 ft: 14871 corp: 34/934b lim: 35 exec/s: 41 rss: 69Mb L: 13/35 MS: 1 CopyPart- 00:09:16.623 [2024-10-04 08:27:09.066620] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.623 [2024-10-04 08:27:09.066646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:16.623 [2024-10-04 08:27:09.066703] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.623 [2024-10-04 08:27:09.066717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:16.623 [2024-10-04 08:27:09.066773] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.623 [2024-10-04 08:27:09.066787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:16.623 [2024-10-04 08:27:09.066843] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.623 [2024-10-04 08:27:09.066856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:16.623 #42 NEW cov: 11860 ft: 14881 corp: 35/968b lim: 35 exec/s: 42 rss: 69Mb L: 
34/35 MS: 1 ChangeBit- 00:09:16.623 [2024-10-04 08:27:09.106616] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.623 [2024-10-04 08:27:09.106642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:16.623 [2024-10-04 08:27:09.106703] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.623 [2024-10-04 08:27:09.106718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:16.623 [2024-10-04 08:27:09.106778] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.623 [2024-10-04 08:27:09.106792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:16.623 #43 NEW cov: 11860 ft: 15031 corp: 36/990b lim: 35 exec/s: 43 rss: 69Mb L: 22/35 MS: 1 PersAutoDict- DE: "\000\000\000\021"- 00:09:16.623 [2024-10-04 08:27:09.146879] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.623 [2024-10-04 08:27:09.146904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:16.623 [2024-10-04 08:27:09.146961] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.623 [2024-10-04 08:27:09.146976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:16.623 [2024-10-04 08:27:09.147035] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.624 [2024-10-04 08:27:09.147049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:16.624 [2024-10-04 08:27:09.147106] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.624 [2024-10-04 08:27:09.147120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:16.624 #44 NEW cov: 11860 ft: 15061 corp: 37/1022b lim: 35 exec/s: 44 rss: 69Mb L: 32/35 MS: 1 ShuffleBytes- 00:09:16.624 [2024-10-04 08:27:09.187152] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.624 [2024-10-04 08:27:09.187178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:16.624 [2024-10-04 08:27:09.187242] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.624 [2024-10-04 08:27:09.187257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:16.624 [2024-10-04 08:27:09.187313] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.624 [2024-10-04 08:27:09.187327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:16.624 [2024-10-04 08:27:09.187384] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.624 [2024-10-04 08:27:09.187398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:16.624 [2024-10-04 08:27:09.187453] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.624 [2024-10-04 08:27:09.187470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:16.624 #45 NEW cov: 11860 ft: 15064 corp: 38/1057b lim: 35 exec/s: 45 rss: 69Mb L: 35/35 MS: 1 ShuffleBytes- 00:09:16.624 [2024-10-04 08:27:09.227218] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.624 [2024-10-04 08:27:09.227243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:16.624 [2024-10-04 08:27:09.227305] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.624 [2024-10-04 08:27:09.227320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:16.624 [2024-10-04 08:27:09.227380] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.624 [2024-10-04 08:27:09.227394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:16.624 [2024-10-04 08:27:09.227455] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.624 [2024-10-04 08:27:09.227469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:16.624 [2024-10-04 08:27:09.227526] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.624 [2024-10-04 08:27:09.227540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:16.624 #46 NEW cov: 11860 ft: 15066 corp: 39/1092b lim: 35 exec/s: 46 rss: 69Mb L: 35/35 MS: 1 PersAutoDict- DE: "\377?"- 00:09:16.624 [2024-10-04 08:27:09.257145] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.624 [2024-10-04 08:27:09.257171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:16.624 [2024-10-04 08:27:09.257235] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.624 [2024-10-04 08:27:09.257250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 
m:0 dnr:0 00:09:16.624 [2024-10-04 08:27:09.257308] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.624 [2024-10-04 08:27:09.257322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:16.624 [2024-10-04 08:27:09.257392] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.624 [2024-10-04 08:27:09.257405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:16.624 #47 NEW cov: 11860 ft: 15075 corp: 40/1122b lim: 35 exec/s: 47 rss: 69Mb L: 30/35 MS: 1 EraseBytes- 00:09:16.624 [2024-10-04 08:27:09.297425] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.624 [2024-10-04 08:27:09.297451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:16.624 [2024-10-04 08:27:09.297514] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.624 [2024-10-04 08:27:09.297528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:16.624 [2024-10-04 08:27:09.297591] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.624 [2024-10-04 08:27:09.297608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:16.624 [2024-10-04 08:27:09.297667] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.624 [2024-10-04 08:27:09.297682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:16.624 [2024-10-04 08:27:09.297741] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.624 [2024-10-04 08:27:09.297755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:16.884 #48 NEW cov: 11860 ft: 15082 corp: 41/1157b lim: 35 exec/s: 48 rss: 69Mb L: 35/35 MS: 1 CopyPart- 00:09:16.884 [2024-10-04 08:27:09.337430] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.884 [2024-10-04 08:27:09.337456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:16.884 [2024-10-04 08:27:09.337498] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.885 [2024-10-04 08:27:09.337511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:16.885 [2024-10-04 08:27:09.337570] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.885 
[2024-10-04 08:27:09.337584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:16.885 [2024-10-04 08:27:09.337642] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.885 [2024-10-04 08:27:09.337655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:16.885 #49 NEW cov: 11860 ft: 15088 corp: 42/1187b lim: 35 exec/s: 49 rss: 69Mb L: 30/35 MS: 1 EraseBytes- 00:09:16.885 [2024-10-04 08:27:09.367496] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.885 [2024-10-04 08:27:09.367521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:16.885 [2024-10-04 08:27:09.367580] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.885 [2024-10-04 08:27:09.367594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:16.885 [2024-10-04 08:27:09.367651] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.885 [2024-10-04 08:27:09.367665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:16.885 [2024-10-04 08:27:09.367720] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.885 [2024-10-04 08:27:09.367734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:16.885 #50 NEW cov: 11860 ft: 15121 corp: 43/1219b lim: 35 exec/s: 50 rss: 69Mb L: 32/35 MS: 1 CopyPart- 00:09:16.885 [2024-10-04 08:27:09.397318] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.885 [2024-10-04 08:27:09.397345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:16.885 [2024-10-04 08:27:09.397405] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.885 [2024-10-04 08:27:09.397419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:16.885 #51 NEW cov: 11860 ft: 15134 corp: 44/1235b lim: 35 exec/s: 51 rss: 69Mb L: 16/35 MS: 1 EraseBytes- 00:09:16.885 [2024-10-04 08:27:09.437688] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.885 [2024-10-04 08:27:09.437715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:16.885 [2024-10-04 08:27:09.437772] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.885 [2024-10-04 08:27:09.437786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:16.885 [2024-10-04 08:27:09.437842] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.885 [2024-10-04 08:27:09.437855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:16.885 [2024-10-04 08:27:09.437913] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.885 [2024-10-04 08:27:09.437927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:16.885 #52 NEW cov: 11860 ft: 15168 corp: 45/1265b lim: 35 exec/s: 52 rss: 70Mb L: 30/35 MS: 1 ShuffleBytes- 00:09:16.885 [2024-10-04 08:27:09.477977] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.885 [2024-10-04 08:27:09.478004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:16.885 [2024-10-04 08:27:09.478062] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.885 [2024-10-04 08:27:09.478076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:16.885 [2024-10-04 08:27:09.478136] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.885 [2024-10-04 08:27:09.478150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:16.885 [2024-10-04 08:27:09.478211] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.885 [2024-10-04 08:27:09.478226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:16.885 [2024-10-04 08:27:09.478282] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.885 [2024-10-04 08:27:09.478296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:16.885 #53 NEW cov: 11860 ft: 15175 corp: 46/1300b lim: 35 exec/s: 26 rss: 70Mb L: 35/35 MS: 1 CopyPart- 00:09:16.885 #53 DONE cov: 11860 ft: 15175 corp: 46/1300b lim: 35 exec/s: 26 rss: 70Mb 00:09:16.885 ###### Recommended dictionary. ###### 00:09:16.885 "\377?" # Uses: 3 00:09:16.885 "\000\000\000\021" # Uses: 1 00:09:16.885 ###### End of recommended dictionary. 
###### 00:09:16.885 Done 53 runs in 2 second(s) 00:09:17.145 08:27:09 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_15.conf 00:09:17.145 08:27:09 -- ../common.sh@72 -- # (( i++ )) 00:09:17.145 08:27:09 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:17.145 08:27:09 -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1 00:09:17.145 08:27:09 -- nvmf/run.sh@23 -- # local fuzzer_type=16 00:09:17.145 08:27:09 -- nvmf/run.sh@24 -- # local timen=1 00:09:17.145 08:27:09 -- nvmf/run.sh@25 -- # local core=0x1 00:09:17.145 08:27:09 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:09:17.145 08:27:09 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf 00:09:17.145 08:27:09 -- nvmf/run.sh@29 -- # printf %02d 16 00:09:17.145 08:27:09 -- nvmf/run.sh@29 -- # port=4416 00:09:17.145 08:27:09 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:09:17.145 08:27:09 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' 00:09:17.145 08:27:09 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:17.145 08:27:09 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 -r /var/tmp/spdk16.sock 00:09:17.145 [2024-10-04 08:27:09.660955] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:09:17.145 [2024-10-04 08:27:09.661024] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1019710 ] 00:09:17.145 EAL: No free 2048 kB hugepages reported on node 1 00:09:17.404 [2024-10-04 08:27:09.838603] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:17.405 [2024-10-04 08:27:09.858532] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:17.405 [2024-10-04 08:27:09.858650] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:17.405 [2024-10-04 08:27:09.909899] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:17.405 [2024-10-04 08:27:09.926232] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 *** 00:09:17.405 INFO: Running with entropic power schedule (0xFF, 100). 00:09:17.405 INFO: Seed: 3762844437 00:09:17.405 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:09:17.405 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:09:17.405 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:09:17.405 INFO: A corpus is not provided, starting from an empty corpus 00:09:17.405 #2 INITED exec/s: 0 rss: 59Mb 00:09:17.405 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:17.405 This may also happen if the target rejected all inputs we tried so far 00:09:17.405 [2024-10-04 08:27:09.971462] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070102450175 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.405 [2024-10-04 08:27:09.971494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:17.405 [2024-10-04 08:27:09.971543] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.405 [2024-10-04 08:27:09.971559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:17.405 [2024-10-04 08:27:09.971608] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.405 [2024-10-04 08:27:09.971624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:17.664 NEW_FUNC[1/671]: 0x467ec8 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:09:17.664 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:17.664 #4 NEW cov: 11667 ft: 11645 corp: 2/66b lim: 105 exec/s: 0 rss: 67Mb L: 65/65 MS: 2 ChangeByte-InsertRepeatedBytes- 00:09:17.664 [2024-10-04 08:27:10.282143] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.664 [2024-10-04 08:27:10.282199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:17.664 #7 NEW cov: 11780 ft: 12700 corp: 3/93b lim: 105 exec/s: 0 rss: 67Mb L: 27/65 MS: 3 CrossOver-InsertByte-InsertRepeatedBytes- 00:09:17.664 [2024-10-04 08:27:10.322074] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.664 [2024-10-04 08:27:10.322103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:17.923 #8 NEW cov: 11786 ft: 13022 corp: 4/120b lim: 105 exec/s: 0 rss: 67Mb L: 27/65 MS: 1 ChangeByte- 00:09:17.923 [2024-10-04 08:27:10.362268] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744072116380927 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.923 [2024-10-04 08:27:10.362299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:17.923 #11 NEW cov: 11871 ft: 13350 corp: 5/149b lim: 105 exec/s: 0 rss: 67Mb L: 29/65 MS: 3 InsertByte-InsertByte-InsertRepeatedBytes- 00:09:17.923 [2024-10-04 08:27:10.402371] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073441116159 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.924 [2024-10-04 08:27:10.402400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
00:09:17.924 #17 NEW cov: 11871 ft: 13464 corp: 6/176b lim: 105 exec/s: 0 rss: 67Mb L: 27/65 MS: 1 ChangeBit- 00:09:17.924 [2024-10-04 08:27:10.442507] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744072116380927 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.924 [2024-10-04 08:27:10.442535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:17.924 #18 NEW cov: 11871 ft: 13533 corp: 7/205b lim: 105 exec/s: 0 rss: 67Mb L: 29/65 MS: 1 ChangeBinInt- 00:09:17.924 [2024-10-04 08:27:10.482583] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.924 [2024-10-04 08:27:10.482611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:17.924 #19 NEW cov: 11871 ft: 13602 corp: 8/233b lim: 105 exec/s: 0 rss: 67Mb L: 28/65 MS: 1 InsertByte- 00:09:17.924 [2024-10-04 08:27:10.522678] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744072116380927 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.924 [2024-10-04 08:27:10.522707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:17.924 #20 NEW cov: 11871 ft: 13625 corp: 9/262b lim: 105 exec/s: 0 rss: 67Mb L: 29/65 MS: 1 ChangeBinInt- 00:09:17.924 [2024-10-04 08:27:10.562777] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.924 [2024-10-04 08:27:10.562806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:17.924 #21 NEW cov: 11871 ft: 13654 corp: 10/289b lim: 105 exec/s: 0 rss: 67Mb L: 27/65 MS: 1 ChangeBinInt- 00:09:17.924 [2024-10-04 08:27:10.602935] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744072116380682 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.924 [2024-10-04 08:27:10.602966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:18.183 #22 NEW cov: 11871 ft: 13699 corp: 11/318b lim: 105 exec/s: 0 rss: 67Mb L: 29/65 MS: 1 CrossOver- 00:09:18.183 [2024-10-04 08:27:10.643361] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070102450175 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.183 [2024-10-04 08:27:10.643388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:18.183 [2024-10-04 08:27:10.643426] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.183 [2024-10-04 08:27:10.643441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:18.183 [2024-10-04 08:27:10.643495] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.183 [2024-10-04 08:27:10.643510] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:18.183 #23 NEW cov: 11871 ft: 13745 corp: 12/383b lim: 105 exec/s: 0 rss: 67Mb L: 65/65 MS: 1 ChangeBinInt- 00:09:18.183 [2024-10-04 08:27:10.683184] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073441116159 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.183 [2024-10-04 08:27:10.683217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:18.183 #24 NEW cov: 11871 ft: 13748 corp: 13/410b lim: 105 exec/s: 0 rss: 67Mb L: 27/65 MS: 1 CopyPart- 00:09:18.183 [2024-10-04 08:27:10.723317] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.183 [2024-10-04 08:27:10.723345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:18.183 #25 NEW cov: 11871 ft: 13762 corp: 14/437b lim: 105 exec/s: 0 rss: 67Mb L: 27/65 MS: 1 ShuffleBytes- 00:09:18.183 [2024-10-04 08:27:10.763407] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.183 [2024-10-04 08:27:10.763437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:18.183 #26 NEW cov: 11871 ft: 13772 corp: 15/464b lim: 105 exec/s: 0 rss: 67Mb L: 27/65 MS: 1 ChangeByte- 00:09:18.183 [2024-10-04 08:27:10.803789] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.183 [2024-10-04 08:27:10.803818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:18.183 [2024-10-04 08:27:10.803853] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.183 [2024-10-04 08:27:10.803868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:18.183 [2024-10-04 08:27:10.803923] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.183 [2024-10-04 08:27:10.803938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:18.183 #27 NEW cov: 11871 ft: 13819 corp: 16/532b lim: 105 exec/s: 0 rss: 68Mb L: 68/68 MS: 1 CrossOver- 00:09:18.183 [2024-10-04 08:27:10.844066] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.183 [2024-10-04 08:27:10.844095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:18.183 [2024-10-04 08:27:10.844137] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:2459565876494606882 len:8739 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.183 [2024-10-04 08:27:10.844153] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:18.183 [2024-10-04 08:27:10.844213] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:2459565876494606882 len:8739 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.183 [2024-10-04 08:27:10.844229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:18.183 [2024-10-04 08:27:10.844293] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:2459565876494606882 len:8739 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.183 [2024-10-04 08:27:10.844310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:18.442 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:18.442 #28 NEW cov: 11894 ft: 14400 corp: 17/632b lim: 105 exec/s: 0 rss: 68Mb L: 100/100 MS: 1 InsertRepeatedBytes- 00:09:18.442 [2024-10-04 08:27:10.893824] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744072116380682 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.442 [2024-10-04 08:27:10.893853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:18.442 #29 NEW cov: 11894 ft: 14416 corp: 18/654b lim: 105 exec/s: 0 rss: 68Mb L: 22/100 MS: 1 EraseBytes- 00:09:18.442 [2024-10-04 08:27:10.933954] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65528 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.442 [2024-10-04 08:27:10.933981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:18.442 #30 NEW cov: 11894 ft: 14428 corp: 19/681b lim: 105 exec/s: 0 rss: 68Mb L: 27/100 MS: 1 ChangeBit- 00:09:18.442 [2024-10-04 08:27:10.974393] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070102450175 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.442 [2024-10-04 08:27:10.974422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:18.442 [2024-10-04 08:27:10.974473] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.442 [2024-10-04 08:27:10.974489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:18.442 [2024-10-04 08:27:10.974543] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.442 [2024-10-04 08:27:10.974560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:18.442 [2024-10-04 08:27:10.974613] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.442 [2024-10-04 08:27:10.974629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:18.442 #31 NEW cov: 11894 ft: 14459 corp: 20/771b lim: 105 exec/s: 31 rss: 68Mb L: 90/100 MS: 1 CopyPart- 00:09:18.442 [2024-10-04 08:27:11.014155] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744072116380682 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.442 [2024-10-04 08:27:11.014184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:18.442 #32 NEW cov: 11894 ft: 14473 corp: 21/800b lim: 105 exec/s: 32 rss: 68Mb L: 29/100 MS: 1 ShuffleBytes- 00:09:18.442 [2024-10-04 08:27:11.054291] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744072116380927 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.442 [2024-10-04 08:27:11.054320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:18.442 #33 NEW cov: 11894 ft: 14486 corp: 22/829b lim: 105 exec/s: 33 rss: 68Mb L: 29/100 MS: 1 ChangeByte- 00:09:18.443 [2024-10-04 08:27:11.094378] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:792633532825796362 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.443 [2024-10-04 08:27:11.094409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:18.443 #34 NEW cov: 11894 ft: 14495 corp: 23/858b lim: 105 exec/s: 34 rss: 68Mb L: 29/100 MS: 1 ShuffleBytes- 00:09:18.702 [2024-10-04 08:27:11.134629] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65528 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.702 [2024-10-04 08:27:11.134659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:18.702 [2024-10-04 08:27:11.134708] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:13402712491054596095 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.702 [2024-10-04 08:27:11.134725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:18.702 #35 NEW cov: 11894 ft: 14797 corp: 24/919b lim: 105 exec/s: 35 rss: 68Mb L: 61/100 MS: 1 InsertRepeatedBytes- 00:09:18.702 [2024-10-04 08:27:11.174651] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744072116380927 len:2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.702 [2024-10-04 08:27:11.174678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:18.702 #36 NEW cov: 11894 ft: 14815 corp: 25/952b lim: 105 exec/s: 36 rss: 68Mb L: 33/100 MS: 1 CMP- DE: "\001\000\000\001"- 00:09:18.702 [2024-10-04 08:27:11.214704] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744072117626111 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.702 [2024-10-04 08:27:11.214733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:18.702 #37 NEW cov: 11894 ft: 14877 corp: 26/981b lim: 105 exec/s: 37 rss: 68Mb L: 29/100 MS: 1 ChangeBinInt- 00:09:18.702 [2024-10-04 08:27:11.254840] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18413964934577651711 len:35724 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.702 [2024-10-04 08:27:11.254869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:18.702 #38 NEW cov: 11894 ft: 14900 corp: 27/1020b lim: 105 exec/s: 38 rss: 69Mb L: 39/100 MS: 1 InsertRepeatedBytes- 00:09:18.702 [2024-10-04 08:27:11.294979] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070102450175 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.702 [2024-10-04 08:27:11.295007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:18.702 #39 NEW cov: 11894 ft: 14913 corp: 28/1053b lim: 105 exec/s: 39 rss: 69Mb L: 33/100 MS: 1 CrossOver- 00:09:18.702 [2024-10-04 08:27:11.335097] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.702 [2024-10-04 08:27:11.335126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:18.702 #40 NEW cov: 11894 ft: 14923 corp: 29/1080b lim: 105 exec/s: 40 rss: 69Mb L: 27/100 MS: 1 ChangeByte- 00:09:18.702 [2024-10-04 08:27:11.375580] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744072116380927 len:91 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.702 [2024-10-04 08:27:11.375613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:18.702 [2024-10-04 08:27:11.375651] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6510615555426900570 len:23131 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.702 [2024-10-04 08:27:11.375667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:18.702 [2024-10-04 08:27:11.375721] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:6510615555426900570 len:23131 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.702 [2024-10-04 08:27:11.375737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:18.702 [2024-10-04 08:27:11.375791] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:6510615555426900570 len:23131 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.702 [2024-10-04 08:27:11.375806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:18.962 #41 NEW cov: 11894 ft: 14989 corp: 30/1177b lim: 105 exec/s: 41 rss: 69Mb L: 97/100 MS: 1 InsertRepeatedBytes- 00:09:18.962 [2024-10-04 08:27:11.415307] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.962 [2024-10-04 08:27:11.415335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:18.962 #42 NEW cov: 11894 ft: 15010 corp: 31/1212b lim: 105 exec/s: 42 rss: 69Mb L: 35/100 MS: 1 CMP- DE: 
"\000\000\000\000\000\000\000\006"- 00:09:18.962 [2024-10-04 08:27:11.455378] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65281 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.962 [2024-10-04 08:27:11.455406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:18.962 #43 NEW cov: 11894 ft: 15056 corp: 32/1247b lim: 105 exec/s: 43 rss: 69Mb L: 35/100 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\006"- 00:09:18.962 [2024-10-04 08:27:11.496064] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.962 [2024-10-04 08:27:11.496092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:18.962 [2024-10-04 08:27:11.496147] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.962 [2024-10-04 08:27:11.496163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:18.962 [2024-10-04 08:27:11.496219] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.962 [2024-10-04 08:27:11.496234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:18.962 [2024-10-04 08:27:11.496290] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.962 [2024-10-04 08:27:11.496306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:18.962 [2024-10-04 08:27:11.496364] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65407 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.962 [2024-10-04 08:27:11.496381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:09:18.962 #44 NEW cov: 11894 ft: 15120 corp: 33/1352b lim: 105 exec/s: 44 rss: 69Mb L: 105/105 MS: 1 InsertRepeatedBytes- 00:09:18.962 [2024-10-04 08:27:11.535647] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.962 [2024-10-04 08:27:11.535676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:18.962 #45 NEW cov: 11894 ft: 15185 corp: 34/1381b lim: 105 exec/s: 45 rss: 69Mb L: 29/105 MS: 1 InsertByte- 00:09:18.962 [2024-10-04 08:27:11.575723] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65528 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.962 [2024-10-04 08:27:11.575751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:18.962 #46 NEW cov: 11894 ft: 15200 corp: 35/1408b lim: 105 exec/s: 46 rss: 69Mb L: 27/105 MS: 1 ShuffleBytes- 
00:09:18.962 [2024-10-04 08:27:11.615972] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.962 [2024-10-04 08:27:11.616001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:18.962 [2024-10-04 08:27:11.616039] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:4278190080 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.962 [2024-10-04 08:27:11.616056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:19.222 #52 NEW cov: 11894 ft: 15210 corp: 36/1460b lim: 105 exec/s: 52 rss: 69Mb L: 52/105 MS: 1 CopyPart- 00:09:19.222 [2024-10-04 08:27:11.655979] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446742978492891135 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:19.222 [2024-10-04 08:27:11.656008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:19.222 #53 NEW cov: 11894 ft: 15215 corp: 37/1488b lim: 105 exec/s: 53 rss: 69Mb L: 28/105 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:09:19.222 [2024-10-04 08:27:11.696614] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:19.222 [2024-10-04 08:27:11.696642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:19.222 [2024-10-04 08:27:11.696695] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:19.222 [2024-10-04 08:27:11.696712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:19.222 [2024-10-04 08:27:11.696767] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:19.222 [2024-10-04 08:27:11.696783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:19.222 [2024-10-04 08:27:11.696836] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:19.222 [2024-10-04 08:27:11.696852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:19.222 [2024-10-04 08:27:11.696906] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65407 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:19.222 [2024-10-04 08:27:11.696921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:09:19.222 #54 NEW cov: 11894 ft: 15239 corp: 38/1593b lim: 105 exec/s: 54 rss: 69Mb L: 105/105 MS: 1 ChangeByte- 00:09:19.222 [2024-10-04 08:27:11.736199] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65531 SGL TRANSPORT DATA BLOCK TRANSPORT 
0x0 00:09:19.222 [2024-10-04 08:27:11.736229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:19.222 #55 NEW cov: 11894 ft: 15241 corp: 39/1620b lim: 105 exec/s: 55 rss: 69Mb L: 27/105 MS: 1 ChangeBinInt- 00:09:19.222 [2024-10-04 08:27:11.776542] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65528 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:19.222 [2024-10-04 08:27:11.776570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:19.222 [2024-10-04 08:27:11.776609] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:13402712491054596095 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:19.222 [2024-10-04 08:27:11.776625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:19.222 [2024-10-04 08:27:11.776677] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:19.222 [2024-10-04 08:27:11.776694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:19.222 #56 NEW cov: 11894 ft: 15289 corp: 40/1685b lim: 105 exec/s: 56 rss: 70Mb L: 65/105 MS: 1 CMP- DE: "\000\000\000\000"- 00:09:19.222 [2024-10-04 08:27:11.816672] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070102450175 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:19.222 [2024-10-04 08:27:11.816699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:19.222 [2024-10-04 08:27:11.816735] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:19.222 [2024-10-04 08:27:11.816751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:19.222 [2024-10-04 08:27:11.816807] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:19.222 [2024-10-04 08:27:11.816822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:19.222 #57 NEW cov: 11894 ft: 15309 corp: 41/1751b lim: 105 exec/s: 57 rss: 70Mb L: 66/105 MS: 1 InsertByte- 00:09:19.222 [2024-10-04 08:27:11.856541] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446743152993379583 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:19.222 [2024-10-04 08:27:11.856568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:19.222 #58 NEW cov: 11894 ft: 15348 corp: 42/1780b lim: 105 exec/s: 58 rss: 70Mb L: 29/105 MS: 1 ChangeByte- 00:09:19.222 [2024-10-04 08:27:11.896659] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744072116380927 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:19.222 [2024-10-04 08:27:11.896689] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:19.483 #59 NEW cov: 11894 ft: 15458 corp: 43/1810b lim: 105 exec/s: 59 rss: 70Mb L: 30/105 MS: 1 InsertByte- 00:09:19.483 [2024-10-04 08:27:11.936770] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744072116380927 len:2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:19.483 [2024-10-04 08:27:11.936798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:19.483 #60 NEW cov: 11894 ft: 15487 corp: 44/1843b lim: 105 exec/s: 30 rss: 70Mb L: 33/105 MS: 1 ChangeByte- 00:09:19.483 #60 DONE cov: 11894 ft: 15487 corp: 44/1843b lim: 105 exec/s: 30 rss: 70Mb 00:09:19.483 ###### Recommended dictionary. ###### 00:09:19.483 "\001\000\000\001" # Uses: 0 00:09:19.483 "\000\000\000\000\000\000\000\006" # Uses: 2 00:09:19.483 "\000\000\000\000\000\000\000\000" # Uses: 0 00:09:19.483 "\000\000\000\000" # Uses: 0 00:09:19.483 ###### End of recommended dictionary. ###### 00:09:19.483 Done 60 runs in 2 second(s) 00:09:19.483 08:27:12 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_16.conf 00:09:19.483 08:27:12 -- ../common.sh@72 -- # (( i++ )) 00:09:19.483 08:27:12 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:19.483 08:27:12 -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1 00:09:19.483 08:27:12 -- nvmf/run.sh@23 -- # local fuzzer_type=17 00:09:19.483 08:27:12 -- nvmf/run.sh@24 -- # local timen=1 00:09:19.483 08:27:12 -- nvmf/run.sh@25 -- # local core=0x1 00:09:19.483 08:27:12 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:09:19.483 08:27:12 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf 00:09:19.483 08:27:12 -- nvmf/run.sh@29 -- # printf %02d 17 00:09:19.483 08:27:12 -- nvmf/run.sh@29 -- # port=4417 00:09:19.483 08:27:12 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:09:19.483 08:27:12 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' 00:09:19.483 08:27:12 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:19.483 08:27:12 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 -r /var/tmp/spdk17.sock 00:09:19.483 [2024-10-04 08:27:12.116339] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
00:09:19.483 [2024-10-04 08:27:12.116433] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1020184 ] 00:09:19.483 EAL: No free 2048 kB hugepages reported on node 1 00:09:19.743 [2024-10-04 08:27:12.293418] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:19.743 [2024-10-04 08:27:12.312356] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:19.743 [2024-10-04 08:27:12.312474] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:19.743 [2024-10-04 08:27:12.363725] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:19.743 [2024-10-04 08:27:12.380040] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 *** 00:09:19.743 INFO: Running with entropic power schedule (0xFF, 100). 00:09:19.743 INFO: Seed: 1922879406 00:09:19.743 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:09:19.743 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:09:19.743 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:09:19.743 INFO: A corpus is not provided, starting from an empty corpus 00:09:19.743 #2 INITED exec/s: 0 rss: 59Mb 00:09:19.743 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:19.743 This may also happen if the target rejected all inputs we tried so far 00:09:20.002 [2024-10-04 08:27:12.424838] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12225489207873350057 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.002 [2024-10-04 08:27:12.424876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:20.002 [2024-10-04 08:27:12.424912] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.002 [2024-10-04 08:27:12.424935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:20.002 [2024-10-04 08:27:12.424966] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.002 [2024-10-04 08:27:12.424983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:20.002 [2024-10-04 08:27:12.425011] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.002 [2024-10-04 08:27:12.425028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:20.261 NEW_FUNC[1/672]: 0x46b1b8 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:09:20.261 NEW_FUNC[2/672]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:20.261 #14 NEW cov: 11688 ft: 11689 corp: 2/117b lim: 120 exec/s: 0 rss: 67Mb L: 
116/116 MS: 2 ChangeByte-InsertRepeatedBytes- 00:09:20.261 [2024-10-04 08:27:12.767708] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:9187201948472803199 len:32640 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.261 [2024-10-04 08:27:12.767778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:20.261 [2024-10-04 08:27:12.767919] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:9187201950435737471 len:32640 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.261 [2024-10-04 08:27:12.767953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:20.261 [2024-10-04 08:27:12.768088] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:9187201950435737471 len:32640 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.261 [2024-10-04 08:27:12.768116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:20.262 [2024-10-04 08:27:12.768257] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:9187201950435737471 len:32640 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.262 [2024-10-04 08:27:12.768290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:20.262 [2024-10-04 08:27:12.768397] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:4 nsid:0 lba:9187201950435737471 len:32640 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.262 [2024-10-04 08:27:12.768430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:09:20.262 #17 NEW cov: 11801 ft: 12328 corp: 3/237b lim: 120 exec/s: 0 rss: 67Mb L: 120/120 MS: 3 CrossOver-InsertByte-InsertRepeatedBytes- 00:09:20.262 [2024-10-04 08:27:12.806807] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12225489206960499113 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.262 [2024-10-04 08:27:12.806842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:20.262 [2024-10-04 08:27:12.806953] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.262 [2024-10-04 08:27:12.806972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:20.262 #19 NEW cov: 11807 ft: 12955 corp: 4/300b lim: 120 exec/s: 0 rss: 67Mb L: 63/120 MS: 2 ShuffleBytes-CrossOver- 00:09:20.262 [2024-10-04 08:27:12.846982] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12225489206960499113 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.262 [2024-10-04 08:27:12.847012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:20.262 [2024-10-04 08:27:12.847136] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.262 [2024-10-04 08:27:12.847159] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:20.262 #20 NEW cov: 11892 ft: 13206 corp: 5/363b lim: 120 exec/s: 0 rss: 67Mb L: 63/120 MS: 1 CopyPart- 00:09:20.262 [2024-10-04 08:27:12.897182] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12225489206960499113 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.262 [2024-10-04 08:27:12.897218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:20.262 [2024-10-04 08:27:12.897325] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.262 [2024-10-04 08:27:12.897350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:20.262 #21 NEW cov: 11892 ft: 13281 corp: 6/426b lim: 120 exec/s: 0 rss: 67Mb L: 63/120 MS: 1 ChangeBinInt- 00:09:20.262 [2024-10-04 08:27:12.937280] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12225489206960499113 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.262 [2024-10-04 08:27:12.937307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:20.262 [2024-10-04 08:27:12.937399] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.262 [2024-10-04 08:27:12.937420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:20.521 #22 NEW cov: 11892 ft: 13335 corp: 7/489b lim: 120 exec/s: 0 rss: 67Mb L: 63/120 MS: 1 ChangeBit- 00:09:20.521 [2024-10-04 08:27:12.977338] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12225489206960499113 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.521 [2024-10-04 08:27:12.977369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:20.521 [2024-10-04 08:27:12.977492] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.521 [2024-10-04 08:27:12.977512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:20.521 #23 NEW cov: 11892 ft: 13416 corp: 8/541b lim: 120 exec/s: 0 rss: 67Mb L: 52/120 MS: 1 EraseBytes- 00:09:20.521 [2024-10-04 08:27:13.017511] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12225489206960499113 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.521 [2024-10-04 08:27:13.017539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:20.521 [2024-10-04 08:27:13.017639] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.521 [2024-10-04 08:27:13.017658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 
dnr:1 00:09:20.521 #24 NEW cov: 11892 ft: 13494 corp: 9/604b lim: 120 exec/s: 0 rss: 67Mb L: 63/120 MS: 1 ChangeBit- 00:09:20.522 [2024-10-04 08:27:13.057618] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12225489206960499113 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.522 [2024-10-04 08:27:13.057647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:20.522 [2024-10-04 08:27:13.057779] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.522 [2024-10-04 08:27:13.057798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:20.522 #25 NEW cov: 11892 ft: 13538 corp: 10/657b lim: 120 exec/s: 0 rss: 67Mb L: 53/120 MS: 1 InsertByte- 00:09:20.522 [2024-10-04 08:27:13.097691] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12225489206960499113 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.522 [2024-10-04 08:27:13.097721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:20.522 [2024-10-04 08:27:13.097844] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.522 [2024-10-04 08:27:13.097864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:20.522 #26 NEW cov: 11892 ft: 13609 corp: 11/720b lim: 120 exec/s: 0 rss: 67Mb L: 63/120 MS: 1 ChangeBinInt- 00:09:20.522 [2024-10-04 08:27:13.137910] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12225489206960499113 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.522 [2024-10-04 08:27:13.137939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:20.522 [2024-10-04 08:27:13.138048] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.522 [2024-10-04 08:27:13.138070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:20.522 #27 NEW cov: 11892 ft: 13688 corp: 12/783b lim: 120 exec/s: 0 rss: 67Mb L: 63/120 MS: 1 ChangeBinInt- 00:09:20.522 [2024-10-04 08:27:13.178002] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12225348469472143785 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.522 [2024-10-04 08:27:13.178031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:20.522 [2024-10-04 08:27:13.178143] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.522 [2024-10-04 08:27:13.178161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:20.522 #28 NEW cov: 11892 ft: 13706 corp: 13/835b lim: 120 exec/s: 0 rss: 67Mb L: 52/120 MS: 1 
ChangeBit- 00:09:20.781 [2024-10-04 08:27:13.218102] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12225489206960499113 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.781 [2024-10-04 08:27:13.218131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:20.781 [2024-10-04 08:27:13.218237] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.781 [2024-10-04 08:27:13.218260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:20.781 #34 NEW cov: 11892 ft: 13740 corp: 14/898b lim: 120 exec/s: 0 rss: 67Mb L: 63/120 MS: 1 ChangeBinInt- 00:09:20.781 [2024-10-04 08:27:13.258291] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12225489206960499113 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.781 [2024-10-04 08:27:13.258323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:20.781 [2024-10-04 08:27:13.258438] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.781 [2024-10-04 08:27:13.258456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:20.781 #35 NEW cov: 11892 ft: 13743 corp: 15/961b lim: 120 exec/s: 0 rss: 67Mb L: 63/120 MS: 1 ChangeBinInt- 00:09:20.781 [2024-10-04 08:27:13.298159] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12225489206960499113 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.781 [2024-10-04 08:27:13.298191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:20.781 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:20.781 #36 NEW cov: 11909 ft: 14573 corp: 16/1005b lim: 120 exec/s: 0 rss: 68Mb L: 44/120 MS: 1 EraseBytes- 00:09:20.781 [2024-10-04 08:27:13.338291] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12225489206960499113 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.781 [2024-10-04 08:27:13.338321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:20.781 #41 NEW cov: 11909 ft: 14620 corp: 17/1036b lim: 120 exec/s: 0 rss: 68Mb L: 31/120 MS: 5 CopyPart-ChangeByte-ChangeByte-ShuffleBytes-CrossOver- 00:09:20.781 [2024-10-04 08:27:13.378860] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12225489207873350057 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.781 [2024-10-04 08:27:13.378888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:20.781 [2024-10-04 08:27:13.378978] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.781 [2024-10-04 08:27:13.378998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:20.781 [2024-10-04 08:27:13.379111] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.781 [2024-10-04 08:27:13.379131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:20.781 #42 NEW cov: 11909 ft: 14950 corp: 18/1111b lim: 120 exec/s: 0 rss: 68Mb L: 75/120 MS: 1 EraseBytes- 00:09:20.782 [2024-10-04 08:27:13.429077] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12225489206960499113 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.782 [2024-10-04 08:27:13.429107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:20.782 [2024-10-04 08:27:13.429235] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.782 [2024-10-04 08:27:13.429256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:20.782 [2024-10-04 08:27:13.429388] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.782 [2024-10-04 08:27:13.429408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:20.782 #43 NEW cov: 11909 ft: 15006 corp: 19/1183b lim: 120 exec/s: 43 rss: 68Mb L: 72/120 MS: 1 CopyPart- 00:09:21.041 [2024-10-04 08:27:13.468902] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12225348469472143785 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.041 [2024-10-04 08:27:13.468929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.041 [2024-10-04 08:27:13.469042] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.041 [2024-10-04 08:27:13.469063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:21.041 #44 NEW cov: 11909 ft: 15031 corp: 20/1235b lim: 120 exec/s: 44 rss: 68Mb L: 52/120 MS: 1 ChangeBinInt- 00:09:21.041 [2024-10-04 08:27:13.519890] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12225489207873350057 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.041 [2024-10-04 08:27:13.519919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.041 [2024-10-04 08:27:13.519997] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.041 [2024-10-04 08:27:13.520018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:21.041 [2024-10-04 08:27:13.520125] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 
lba:12225489211083456425 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.041 [2024-10-04 08:27:13.520145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:21.041 [2024-10-04 08:27:13.520255] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.041 [2024-10-04 08:27:13.520275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:21.041 [2024-10-04 08:27:13.520395] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:4 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.041 [2024-10-04 08:27:13.520419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:09:21.041 #45 NEW cov: 11909 ft: 15051 corp: 21/1355b lim: 120 exec/s: 45 rss: 68Mb L: 120/120 MS: 1 InsertRepeatedBytes- 00:09:21.042 [2024-10-04 08:27:13.559201] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12225348469472143785 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.042 [2024-10-04 08:27:13.559233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.042 [2024-10-04 08:27:13.559348] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.042 [2024-10-04 08:27:13.559382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:21.042 #46 NEW cov: 11909 ft: 15073 corp: 22/1408b lim: 120 exec/s: 46 rss: 68Mb L: 53/120 MS: 1 InsertByte- 00:09:21.042 [2024-10-04 08:27:13.609313] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12225489206960499113 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.042 [2024-10-04 08:27:13.609344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.042 [2024-10-04 08:27:13.609450] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:9187201951143143807 len:32640 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.042 [2024-10-04 08:27:13.609471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:21.042 #47 NEW cov: 11909 ft: 15087 corp: 23/1460b lim: 120 exec/s: 47 rss: 68Mb L: 52/120 MS: 1 CrossOver- 00:09:21.042 [2024-10-04 08:27:13.649533] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12225489206960499113 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.042 [2024-10-04 08:27:13.649570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.042 [2024-10-04 08:27:13.649702] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.042 [2024-10-04 08:27:13.649721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:21.042 #48 NEW cov: 11918 ft: 15152 corp: 24/1513b lim: 120 exec/s: 48 rss: 68Mb L: 53/120 MS: 1 ShuffleBytes- 00:09:21.042 [2024-10-04 08:27:13.699528] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12225489206960499113 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.042 [2024-10-04 08:27:13.699558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.042 [2024-10-04 08:27:13.699684] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.042 [2024-10-04 08:27:13.699705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:21.301 #49 NEW cov: 11918 ft: 15170 corp: 25/1576b lim: 120 exec/s: 49 rss: 68Mb L: 63/120 MS: 1 ShuffleBytes- 00:09:21.301 [2024-10-04 08:27:13.749756] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12225348469472143785 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.301 [2024-10-04 08:27:13.749788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.301 [2024-10-04 08:27:13.749919] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.301 [2024-10-04 08:27:13.749939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:21.301 #50 NEW cov: 11918 ft: 15181 corp: 26/1628b lim: 120 exec/s: 50 rss: 68Mb L: 52/120 MS: 1 CopyPart- 00:09:21.301 [2024-10-04 08:27:13.789833] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12225489206960499113 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.301 [2024-10-04 08:27:13.789864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.301 [2024-10-04 08:27:13.789995] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.301 [2024-10-04 08:27:13.790017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:21.301 #51 NEW cov: 11918 ft: 15221 corp: 27/1691b lim: 120 exec/s: 51 rss: 68Mb L: 63/120 MS: 1 ChangeByte- 00:09:21.301 [2024-10-04 08:27:13.829746] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12225489206960499113 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.301 [2024-10-04 08:27:13.829773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.301 #52 NEW cov: 11918 ft: 15235 corp: 28/1722b lim: 120 exec/s: 52 rss: 68Mb L: 31/120 MS: 1 CrossOver- 00:09:21.301 [2024-10-04 08:27:13.879906] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12225489206960499113 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.301 [2024-10-04 08:27:13.879935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.301 #53 NEW cov: 11918 ft: 15323 corp: 29/1754b lim: 120 exec/s: 53 rss: 68Mb L: 32/120 MS: 1 InsertByte- 00:09:21.301 [2024-10-04 08:27:13.921115] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12225489207873350057 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.301 [2024-10-04 08:27:13.921148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.301 [2024-10-04 08:27:13.921256] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.301 [2024-10-04 08:27:13.921276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:21.301 [2024-10-04 08:27:13.921401] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:12225489211083456425 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.301 [2024-10-04 08:27:13.921421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:21.301 [2024-10-04 08:27:13.921529] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.301 [2024-10-04 08:27:13.921547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:21.301 [2024-10-04 08:27:13.921662] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:4 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.301 [2024-10-04 08:27:13.921682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:09:21.301 #54 NEW cov: 11918 ft: 15362 corp: 30/1874b lim: 120 exec/s: 54 rss: 68Mb L: 120/120 MS: 1 ChangeByte- 00:09:21.301 [2024-10-04 08:27:13.970437] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12224363307053656489 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.301 [2024-10-04 08:27:13.970467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.301 [2024-10-04 08:27:13.970572] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.301 [2024-10-04 08:27:13.970592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:21.560 #55 NEW cov: 11918 ft: 15429 corp: 31/1937b lim: 120 exec/s: 55 rss: 68Mb L: 63/120 MS: 1 ChangeBinInt- 00:09:21.560 [2024-10-04 08:27:14.010531] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12225489206960499113 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.561 [2024-10-04 08:27:14.010562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.561 [2024-10-04 08:27:14.010684] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12225489209634957737 
len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.561 [2024-10-04 08:27:14.010705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:21.561 #56 NEW cov: 11918 ft: 15465 corp: 32/2001b lim: 120 exec/s: 56 rss: 68Mb L: 64/120 MS: 1 InsertByte- 00:09:21.561 [2024-10-04 08:27:14.050506] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12225489206960499113 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.561 [2024-10-04 08:27:14.050535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.561 #57 NEW cov: 11918 ft: 15524 corp: 33/2033b lim: 120 exec/s: 57 rss: 69Mb L: 32/120 MS: 1 InsertByte- 00:09:21.561 [2024-10-04 08:27:14.090995] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12225489206960499113 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.561 [2024-10-04 08:27:14.091023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.561 [2024-10-04 08:27:14.091082] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.561 [2024-10-04 08:27:14.091102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:21.561 [2024-10-04 08:27:14.091228] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:7885078839350357357 len:28014 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.561 [2024-10-04 08:27:14.091251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:21.561 #58 NEW cov: 11918 ft: 15533 corp: 34/2105b lim: 120 exec/s: 58 rss: 69Mb L: 72/120 MS: 1 InsertRepeatedBytes- 00:09:21.561 [2024-10-04 08:27:14.130894] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12225348469479024960 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.561 [2024-10-04 08:27:14.130926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.561 [2024-10-04 08:27:14.131028] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.561 [2024-10-04 08:27:14.131050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:21.561 #59 NEW cov: 11918 ft: 15560 corp: 35/2157b lim: 120 exec/s: 59 rss: 69Mb L: 52/120 MS: 1 ShuffleBytes- 00:09:21.561 [2024-10-04 08:27:14.170944] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12225348469479024960 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.561 [2024-10-04 08:27:14.170973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.561 [2024-10-04 08:27:14.171108] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.561 [2024-10-04 08:27:14.171128] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:21.561 #60 NEW cov: 11918 ft: 15569 corp: 36/2209b lim: 120 exec/s: 60 rss: 69Mb L: 52/120 MS: 1 ChangeByte- 00:09:21.561 [2024-10-04 08:27:14.211429] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069586616319 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.561 [2024-10-04 08:27:14.211458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.561 [2024-10-04 08:27:14.211577] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.561 [2024-10-04 08:27:14.211598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:21.561 [2024-10-04 08:27:14.211716] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.561 [2024-10-04 08:27:14.211737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:21.561 #61 NEW cov: 11918 ft: 15574 corp: 37/2303b lim: 120 exec/s: 61 rss: 69Mb L: 94/120 MS: 1 InsertRepeatedBytes- 00:09:21.820 [2024-10-04 08:27:14.251230] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12225348469479024960 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.821 [2024-10-04 08:27:14.251260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.821 [2024-10-04 08:27:14.251365] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.821 [2024-10-04 08:27:14.251392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:21.821 #62 NEW cov: 11918 ft: 15586 corp: 38/2357b lim: 120 exec/s: 62 rss: 69Mb L: 54/120 MS: 1 CopyPart- 00:09:21.821 [2024-10-04 08:27:14.291480] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12224363307053656489 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.821 [2024-10-04 08:27:14.291514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.821 [2024-10-04 08:27:14.291618] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.821 [2024-10-04 08:27:14.291641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:21.821 #63 NEW cov: 11925 ft: 15609 corp: 39/2420b lim: 120 exec/s: 63 rss: 69Mb L: 63/120 MS: 1 ChangeBinInt- 00:09:21.821 [2024-10-04 08:27:14.331547] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:6221254862804965206 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.821 [2024-10-04 08:27:14.331580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.821 [2024-10-04 08:27:14.331679] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.821 [2024-10-04 08:27:14.331701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:21.821 [2024-10-04 08:27:14.372227] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:6221254862804965206 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.821 [2024-10-04 08:27:14.372257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.821 [2024-10-04 08:27:14.372322] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12225489209634957737 len:43434 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.821 [2024-10-04 08:27:14.372341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:21.821 [2024-10-04 08:27:14.372454] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:12225398693199173464 len:22103 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.821 [2024-10-04 08:27:14.372476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:21.821 [2024-10-04 08:27:14.372593] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:15914838024376868060 len:56541 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.821 [2024-10-04 08:27:14.372614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:21.821 #65 NEW cov: 11925 ft: 15621 corp: 40/2516b lim: 120 exec/s: 65 rss: 69Mb L: 96/120 MS: 2 CopyPart-InsertRepeatedBytes- 00:09:21.821 [2024-10-04 08:27:14.412006] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069586616319 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.821 [2024-10-04 08:27:14.412038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:21.821 [2024-10-04 08:27:14.412117] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.821 [2024-10-04 08:27:14.412138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:21.821 [2024-10-04 08:27:14.412265] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.821 [2024-10-04 08:27:14.412289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:21.821 #66 NEW cov: 11925 ft: 15626 corp: 41/2610b lim: 120 exec/s: 33 rss: 69Mb L: 94/120 MS: 1 ChangeBit- 00:09:21.821 #66 DONE cov: 11925 ft: 15626 corp: 41/2610b lim: 120 exec/s: 33 rss: 69Mb 00:09:21.821 Done 66 runs in 2 second(s) 00:09:22.080 08:27:14 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_17.conf 00:09:22.080 08:27:14 -- ../common.sh@72 -- # (( i++ )) 00:09:22.080 
08:27:14 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:22.080 08:27:14 -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1 00:09:22.081 08:27:14 -- nvmf/run.sh@23 -- # local fuzzer_type=18 00:09:22.081 08:27:14 -- nvmf/run.sh@24 -- # local timen=1 00:09:22.081 08:27:14 -- nvmf/run.sh@25 -- # local core=0x1 00:09:22.081 08:27:14 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:09:22.081 08:27:14 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf 00:09:22.081 08:27:14 -- nvmf/run.sh@29 -- # printf %02d 18 00:09:22.081 08:27:14 -- nvmf/run.sh@29 -- # port=4418 00:09:22.081 08:27:14 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:09:22.081 08:27:14 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' 00:09:22.081 08:27:14 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:22.081 08:27:14 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 -r /var/tmp/spdk18.sock 00:09:22.081 [2024-10-04 08:27:14.577868] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:09:22.081 [2024-10-04 08:27:14.577922] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1020703 ] 00:09:22.081 EAL: No free 2048 kB hugepages reported on node 1 00:09:22.081 [2024-10-04 08:27:14.748235] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:22.341 [2024-10-04 08:27:14.767515] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:22.341 [2024-10-04 08:27:14.767633] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:22.341 [2024-10-04 08:27:14.818876] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:22.341 [2024-10-04 08:27:14.835195] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:09:22.341 INFO: Running with entropic power schedule (0xFF, 100). 00:09:22.341 INFO: Seed: 83907448 00:09:22.341 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:09:22.341 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:09:22.341 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:09:22.341 INFO: A corpus is not provided, starting from an empty corpus 00:09:22.341 #2 INITED exec/s: 0 rss: 60Mb 00:09:22.341 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
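The xtrace records here (the `-- nvmf/run.sh@NN` and `-- ../common.sh@NN` entries) document how each fuzzer instance is launched: common.sh loops over fuzzer indices, and start_llvm_fuzz derives a per-instance TCP port from the two-digit fuzzer number, rewrites the JSON config template to that port, and hands llvm_nvme_fuzz its own corpus directory and RPC socket. A minimal reconstruction of that function from the trace, assuming details xtrace does not show (the sed output redirection and the $rootdir/$output_dir variable names):

  start_llvm_fuzz() {
    local fuzzer_type=$1 timen=$2 core=$3
    local corpus_dir=$rootdir/../corpus/llvm_nvmf_$fuzzer_type
    local nvmf_cfg=/tmp/fuzz_json_$fuzzer_type.conf
    local port=44$(printf %02d $fuzzer_type)   # inferred: 17 -> 4417, 18 -> 4418
    mkdir -p $corpus_dir
    local trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
    # Point this instance's NVMe/TCP listener at its own port.
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
      $rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf > $nvmf_cfg
    $rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m $core -s 512 \
      -P $output_dir/llvm/ -F "$trid" -c $nvmf_cfg -t $timen \
      -D $corpus_dir -Z $fuzzer_type -r /var/tmp/spdk$fuzzer_type.sock
    rm -rf $nvmf_cfg
  }

Invoked from common.sh's loop as `start_llvm_fuzz 18 1 0x1` (fuzzer index, run time, core mask), matching the trace above.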
00:09:22.341 This may also happen if the target rejected all inputs we tried so far 00:09:22.341 [2024-10-04 08:27:14.879997] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:22.341 [2024-10-04 08:27:14.880030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:22.341 [2024-10-04 08:27:14.880061] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:22.341 [2024-10-04 08:27:14.880077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:22.341 [2024-10-04 08:27:14.880110] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:22.341 [2024-10-04 08:27:14.880124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:22.600 NEW_FUNC[1/670]: 0x46ea18 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:09:22.600 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:22.600 #4 NEW cov: 11632 ft: 11629 corp: 2/75b lim: 100 exec/s: 0 rss: 67Mb L: 74/74 MS: 2 ChangeByte-InsertRepeatedBytes- 00:09:22.600 [2024-10-04 08:27:15.200808] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:22.600 [2024-10-04 08:27:15.200849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:22.600 [2024-10-04 08:27:15.200885] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:22.600 [2024-10-04 08:27:15.200901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:22.600 [2024-10-04 08:27:15.200929] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:22.600 [2024-10-04 08:27:15.200944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:22.600 #10 NEW cov: 11745 ft: 12141 corp: 3/149b lim: 100 exec/s: 0 rss: 67Mb L: 74/74 MS: 1 ChangeByte- 00:09:22.600 [2024-10-04 08:27:15.270869] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:22.600 [2024-10-04 08:27:15.270900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:22.600 [2024-10-04 08:27:15.270933] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:22.601 [2024-10-04 08:27:15.270949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:22.601 [2024-10-04 08:27:15.270978] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:22.601 [2024-10-04 08:27:15.270993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:22.860 #11 NEW cov: 11751 ft: 12457 corp: 4/223b lim: 100 exec/s: 0 
rss: 67Mb L: 74/74 MS: 1 ChangeBit- 00:09:22.860 [2024-10-04 08:27:15.330996] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:22.860 [2024-10-04 08:27:15.331025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:22.860 [2024-10-04 08:27:15.331056] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:22.860 [2024-10-04 08:27:15.331071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:22.860 [2024-10-04 08:27:15.331098] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:22.860 [2024-10-04 08:27:15.331113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:22.860 #12 NEW cov: 11836 ft: 12737 corp: 5/293b lim: 100 exec/s: 0 rss: 67Mb L: 70/74 MS: 1 EraseBytes- 00:09:22.860 [2024-10-04 08:27:15.381097] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:22.860 [2024-10-04 08:27:15.381125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:22.860 [2024-10-04 08:27:15.381155] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:22.860 [2024-10-04 08:27:15.381171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:22.860 [2024-10-04 08:27:15.381210] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:22.860 [2024-10-04 08:27:15.381225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:22.860 #13 NEW cov: 11836 ft: 12853 corp: 6/363b lim: 100 exec/s: 0 rss: 68Mb L: 70/74 MS: 1 ChangeBinInt- 00:09:22.860 [2024-10-04 08:27:15.441283] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:22.860 [2024-10-04 08:27:15.441311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:22.860 [2024-10-04 08:27:15.441341] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:22.860 [2024-10-04 08:27:15.441357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:22.860 [2024-10-04 08:27:15.441385] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:22.860 [2024-10-04 08:27:15.441400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:22.860 #14 NEW cov: 11836 ft: 12977 corp: 7/437b lim: 100 exec/s: 0 rss: 68Mb L: 74/74 MS: 1 ShuffleBytes- 00:09:22.860 [2024-10-04 08:27:15.501454] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:22.860 [2024-10-04 08:27:15.501482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:22.860 [2024-10-04 08:27:15.501513] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:22.860 [2024-10-04 08:27:15.501529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:22.860 [2024-10-04 08:27:15.501557] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:22.860 [2024-10-04 08:27:15.501571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:22.860 #15 NEW cov: 11836 ft: 13033 corp: 8/511b lim: 100 exec/s: 0 rss: 68Mb L: 74/74 MS: 1 ShuffleBytes- 00:09:23.120 [2024-10-04 08:27:15.551571] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:23.120 [2024-10-04 08:27:15.551599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:23.120 [2024-10-04 08:27:15.551631] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:23.120 [2024-10-04 08:27:15.551646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:23.120 [2024-10-04 08:27:15.551672] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:23.120 [2024-10-04 08:27:15.551687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:23.120 #16 NEW cov: 11836 ft: 13083 corp: 9/581b lim: 100 exec/s: 0 rss: 68Mb L: 70/74 MS: 1 ChangeBit- 00:09:23.120 [2024-10-04 08:27:15.601717] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:23.120 [2024-10-04 08:27:15.601745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:23.120 [2024-10-04 08:27:15.601776] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:23.120 [2024-10-04 08:27:15.601792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:23.120 [2024-10-04 08:27:15.601819] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:23.120 [2024-10-04 08:27:15.601841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:23.120 [2024-10-04 08:27:15.601869] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:09:23.120 [2024-10-04 08:27:15.601883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:23.120 #17 NEW cov: 11836 ft: 13470 corp: 10/669b lim: 100 exec/s: 0 rss: 68Mb L: 88/88 MS: 1 InsertRepeatedBytes- 00:09:23.120 [2024-10-04 08:27:15.661863] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:23.120 [2024-10-04 08:27:15.661892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:23.120 [2024-10-04 08:27:15.661923] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES 
(08) sqid:1 cid:1 nsid:0 00:09:23.120 [2024-10-04 08:27:15.661938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:23.120 [2024-10-04 08:27:15.661966] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:23.120 [2024-10-04 08:27:15.661980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:23.120 #18 NEW cov: 11836 ft: 13546 corp: 11/739b lim: 100 exec/s: 0 rss: 68Mb L: 70/88 MS: 1 ChangeBit- 00:09:23.120 [2024-10-04 08:27:15.722087] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:23.120 [2024-10-04 08:27:15.722117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:23.120 [2024-10-04 08:27:15.722150] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:23.120 [2024-10-04 08:27:15.722166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:23.120 [2024-10-04 08:27:15.722204] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:23.120 [2024-10-04 08:27:15.722220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:23.120 #19 NEW cov: 11836 ft: 13591 corp: 12/813b lim: 100 exec/s: 0 rss: 68Mb L: 74/88 MS: 1 CopyPart- 00:09:23.120 [2024-10-04 08:27:15.772146] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:23.120 [2024-10-04 08:27:15.772176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:23.120 [2024-10-04 08:27:15.772216] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:23.120 [2024-10-04 08:27:15.772231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:23.120 [2024-10-04 08:27:15.772259] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:23.120 [2024-10-04 08:27:15.772273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:23.120 [2024-10-04 08:27:15.772300] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:09:23.120 [2024-10-04 08:27:15.772314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:23.380 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:23.380 #20 NEW cov: 11859 ft: 13639 corp: 13/895b lim: 100 exec/s: 0 rss: 68Mb L: 82/88 MS: 1 CrossOver- 00:09:23.380 [2024-10-04 08:27:15.842332] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:23.380 [2024-10-04 08:27:15.842362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:23.380 [2024-10-04 08:27:15.842398] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:23.380 [2024-10-04 08:27:15.842413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:23.380 [2024-10-04 08:27:15.842441] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:23.380 [2024-10-04 08:27:15.842456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:23.380 #21 NEW cov: 11859 ft: 13644 corp: 14/969b lim: 100 exec/s: 21 rss: 68Mb L: 74/88 MS: 1 ChangeBinInt- 00:09:23.380 [2024-10-04 08:27:15.893284] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:23.380 [2024-10-04 08:27:15.893319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:23.380 [2024-10-04 08:27:15.893377] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:23.380 [2024-10-04 08:27:15.893395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:23.380 [2024-10-04 08:27:15.893449] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:23.380 [2024-10-04 08:27:15.893467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:23.380 [2024-10-04 08:27:15.893523] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:09:23.380 [2024-10-04 08:27:15.893541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:23.380 #22 NEW cov: 11859 ft: 13755 corp: 15/1051b lim: 100 exec/s: 22 rss: 68Mb L: 82/88 MS: 1 CopyPart- 00:09:23.380 [2024-10-04 08:27:15.943230] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:23.380 [2024-10-04 08:27:15.943257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:23.380 [2024-10-04 08:27:15.943289] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:23.380 [2024-10-04 08:27:15.943303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:23.380 [2024-10-04 08:27:15.943353] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:23.380 [2024-10-04 08:27:15.943367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:23.380 #28 NEW cov: 11859 ft: 13820 corp: 16/1121b lim: 100 exec/s: 28 rss: 68Mb L: 70/88 MS: 1 ShuffleBytes- 00:09:23.380 [2024-10-04 08:27:15.983317] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:23.380 [2024-10-04 08:27:15.983344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:23.380 [2024-10-04 08:27:15.983385] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 
cid:1 nsid:0 00:09:23.380 [2024-10-04 08:27:15.983399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:23.380 [2024-10-04 08:27:15.983447] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:23.380 [2024-10-04 08:27:15.983461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:23.380 #29 NEW cov: 11859 ft: 13884 corp: 17/1191b lim: 100 exec/s: 29 rss: 68Mb L: 70/88 MS: 1 ChangeBinInt- 00:09:23.380 [2024-10-04 08:27:16.023439] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:23.380 [2024-10-04 08:27:16.023466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:23.380 [2024-10-04 08:27:16.023515] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:23.380 [2024-10-04 08:27:16.023529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:23.380 [2024-10-04 08:27:16.023581] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:23.380 [2024-10-04 08:27:16.023594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:23.380 #30 NEW cov: 11859 ft: 13927 corp: 18/1261b lim: 100 exec/s: 30 rss: 68Mb L: 70/88 MS: 1 ChangeBit- 00:09:23.640 [2024-10-04 08:27:16.063601] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:23.640 [2024-10-04 08:27:16.063628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:23.640 [2024-10-04 08:27:16.063661] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:23.640 [2024-10-04 08:27:16.063674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:23.640 [2024-10-04 08:27:16.063726] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:23.640 [2024-10-04 08:27:16.063741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:23.640 #31 NEW cov: 11859 ft: 13941 corp: 19/1335b lim: 100 exec/s: 31 rss: 68Mb L: 74/88 MS: 1 ChangeBinInt- 00:09:23.640 [2024-10-04 08:27:16.103709] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:23.640 [2024-10-04 08:27:16.103736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:23.640 [2024-10-04 08:27:16.103772] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:23.640 [2024-10-04 08:27:16.103786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:23.640 [2024-10-04 08:27:16.103836] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:23.641 [2024-10-04 08:27:16.103851] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:23.641 #32 NEW cov: 11859 ft: 13955 corp: 20/1409b lim: 100 exec/s: 32 rss: 68Mb L: 74/88 MS: 1 ChangeBit- 00:09:23.641 [2024-10-04 08:27:16.143819] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:23.641 [2024-10-04 08:27:16.143846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:23.641 [2024-10-04 08:27:16.143877] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:23.641 [2024-10-04 08:27:16.143891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:23.641 [2024-10-04 08:27:16.143942] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:23.641 [2024-10-04 08:27:16.143954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:23.641 #35 NEW cov: 11859 ft: 13958 corp: 21/1474b lim: 100 exec/s: 35 rss: 68Mb L: 65/88 MS: 3 CopyPart-ChangeByte-InsertRepeatedBytes- 00:09:23.641 [2024-10-04 08:27:16.183929] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:23.641 [2024-10-04 08:27:16.183956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:23.641 [2024-10-04 08:27:16.183992] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:23.641 [2024-10-04 08:27:16.184010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:23.641 [2024-10-04 08:27:16.184064] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:23.641 [2024-10-04 08:27:16.184078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:23.641 #36 NEW cov: 11859 ft: 13967 corp: 22/1547b lim: 100 exec/s: 36 rss: 68Mb L: 73/88 MS: 1 CopyPart- 00:09:23.641 [2024-10-04 08:27:16.224163] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:23.641 [2024-10-04 08:27:16.224193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:23.641 [2024-10-04 08:27:16.224232] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:23.641 [2024-10-04 08:27:16.224247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:23.641 [2024-10-04 08:27:16.224296] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:23.641 [2024-10-04 08:27:16.224310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:23.641 [2024-10-04 08:27:16.224360] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:09:23.641 [2024-10-04 08:27:16.224375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:23.641 #37 NEW cov: 11859 ft: 13974 corp: 23/1645b lim: 100 exec/s: 37 rss: 68Mb L: 98/98 MS: 1 InsertRepeatedBytes- 00:09:23.641 [2024-10-04 08:27:16.264130] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:23.641 [2024-10-04 08:27:16.264157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:23.641 [2024-10-04 08:27:16.264195] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:23.641 [2024-10-04 08:27:16.264209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:23.641 [2024-10-04 08:27:16.264259] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:23.641 [2024-10-04 08:27:16.264274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:23.641 #38 NEW cov: 11859 ft: 13994 corp: 24/1724b lim: 100 exec/s: 38 rss: 68Mb L: 79/98 MS: 1 CopyPart- 00:09:23.641 [2024-10-04 08:27:16.304384] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:23.641 [2024-10-04 08:27:16.304411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:23.641 [2024-10-04 08:27:16.304453] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:23.641 [2024-10-04 08:27:16.304467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:23.641 [2024-10-04 08:27:16.304516] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:23.641 [2024-10-04 08:27:16.304531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:23.641 [2024-10-04 08:27:16.304580] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:09:23.641 [2024-10-04 08:27:16.304594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:23.901 #39 NEW cov: 11859 ft: 14020 corp: 25/1807b lim: 100 exec/s: 39 rss: 68Mb L: 83/98 MS: 1 InsertByte- 00:09:23.901 [2024-10-04 08:27:16.344384] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:23.901 [2024-10-04 08:27:16.344410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:23.901 [2024-10-04 08:27:16.344457] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:23.901 [2024-10-04 08:27:16.344470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:23.901 [2024-10-04 08:27:16.344519] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:23.901 [2024-10-04 08:27:16.344533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 
p:0 m:0 dnr:1 00:09:23.901 #40 NEW cov: 11859 ft: 14033 corp: 26/1879b lim: 100 exec/s: 40 rss: 68Mb L: 72/98 MS: 1 CMP- DE: "\004\000"- 00:09:23.901 [2024-10-04 08:27:16.384485] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:23.901 [2024-10-04 08:27:16.384512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:23.901 [2024-10-04 08:27:16.384554] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:23.901 [2024-10-04 08:27:16.384568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:23.901 [2024-10-04 08:27:16.384618] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:23.901 [2024-10-04 08:27:16.384632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:23.901 #41 NEW cov: 11859 ft: 14070 corp: 27/1951b lim: 100 exec/s: 41 rss: 68Mb L: 72/98 MS: 1 PersAutoDict- DE: "\004\000"- 00:09:23.901 [2024-10-04 08:27:16.424712] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:23.901 [2024-10-04 08:27:16.424739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:23.901 [2024-10-04 08:27:16.424786] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:23.901 [2024-10-04 08:27:16.424799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:23.901 [2024-10-04 08:27:16.424848] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:23.901 [2024-10-04 08:27:16.424863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:23.901 [2024-10-04 08:27:16.424909] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:09:23.901 [2024-10-04 08:27:16.424923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:23.901 #42 NEW cov: 11859 ft: 14118 corp: 28/2039b lim: 100 exec/s: 42 rss: 68Mb L: 88/98 MS: 1 CrossOver- 00:09:23.901 [2024-10-04 08:27:16.464728] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:23.901 [2024-10-04 08:27:16.464755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:23.901 [2024-10-04 08:27:16.464793] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:23.901 [2024-10-04 08:27:16.464807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:23.901 [2024-10-04 08:27:16.464858] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:23.901 [2024-10-04 08:27:16.464873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:23.901 #43 NEW cov: 11859 
ft: 14144 corp: 29/2113b lim: 100 exec/s: 43 rss: 68Mb L: 74/98 MS: 1 CopyPart- 00:09:23.901 [2024-10-04 08:27:16.504967] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:23.901 [2024-10-04 08:27:16.504993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:23.901 [2024-10-04 08:27:16.505036] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:23.901 [2024-10-04 08:27:16.505050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:23.901 [2024-10-04 08:27:16.505102] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:23.901 [2024-10-04 08:27:16.505116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:23.901 [2024-10-04 08:27:16.505169] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:09:23.901 [2024-10-04 08:27:16.505184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:23.901 #44 NEW cov: 11859 ft: 14156 corp: 30/2196b lim: 100 exec/s: 44 rss: 68Mb L: 83/98 MS: 1 CrossOver- 00:09:23.901 [2024-10-04 08:27:16.545048] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:23.901 [2024-10-04 08:27:16.545073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:23.901 [2024-10-04 08:27:16.545121] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:23.901 [2024-10-04 08:27:16.545135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:23.901 [2024-10-04 08:27:16.545184] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:23.901 [2024-10-04 08:27:16.545204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:23.901 [2024-10-04 08:27:16.545256] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:09:23.901 [2024-10-04 08:27:16.545270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:23.901 #45 NEW cov: 11859 ft: 14190 corp: 31/2276b lim: 100 exec/s: 45 rss: 68Mb L: 80/98 MS: 1 InsertRepeatedBytes- 00:09:24.161 [2024-10-04 08:27:16.585061] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:24.161 [2024-10-04 08:27:16.585088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:24.161 [2024-10-04 08:27:16.585127] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:24.161 [2024-10-04 08:27:16.585141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:24.161 [2024-10-04 08:27:16.585196] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:24.161 [2024-10-04 08:27:16.585211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:24.161 #46 NEW cov: 11859 ft: 14202 corp: 32/2351b lim: 100 exec/s: 46 rss: 68Mb L: 75/98 MS: 1 InsertByte- 00:09:24.161 [2024-10-04 08:27:16.615048] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:24.161 [2024-10-04 08:27:16.615074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:24.161 [2024-10-04 08:27:16.615108] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:24.161 [2024-10-04 08:27:16.615122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:24.161 #48 NEW cov: 11859 ft: 14547 corp: 33/2399b lim: 100 exec/s: 48 rss: 68Mb L: 48/98 MS: 2 ChangeBit-InsertRepeatedBytes- 00:09:24.161 [2024-10-04 08:27:16.655265] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:24.161 [2024-10-04 08:27:16.655291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:24.161 [2024-10-04 08:27:16.655332] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:24.161 [2024-10-04 08:27:16.655346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:24.161 [2024-10-04 08:27:16.655394] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:24.161 [2024-10-04 08:27:16.655409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:24.161 #49 NEW cov: 11859 ft: 14552 corp: 34/2473b lim: 100 exec/s: 49 rss: 68Mb L: 74/98 MS: 1 ShuffleBytes- 00:09:24.161 [2024-10-04 08:27:16.695514] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:24.161 [2024-10-04 08:27:16.695540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:24.161 [2024-10-04 08:27:16.695579] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:24.161 [2024-10-04 08:27:16.695592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:24.161 [2024-10-04 08:27:16.695640] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:24.161 [2024-10-04 08:27:16.695654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:24.161 [2024-10-04 08:27:16.695703] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:09:24.161 [2024-10-04 08:27:16.695716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:24.161 #50 NEW cov: 11859 ft: 14553 corp: 35/2553b lim: 100 exec/s: 50 rss: 68Mb L: 80/98 MS: 1 EraseBytes- 00:09:24.161 [2024-10-04 
08:27:16.735614] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:24.161 [2024-10-04 08:27:16.735641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:24.161 [2024-10-04 08:27:16.735684] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:24.161 [2024-10-04 08:27:16.735699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:24.161 [2024-10-04 08:27:16.735746] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:24.161 [2024-10-04 08:27:16.735761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:24.162 [2024-10-04 08:27:16.735811] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:09:24.162 [2024-10-04 08:27:16.735824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:24.162 #51 NEW cov: 11859 ft: 14560 corp: 36/2638b lim: 100 exec/s: 51 rss: 68Mb L: 85/98 MS: 1 PersAutoDict- DE: "\004\000"- 00:09:24.162 [2024-10-04 08:27:16.775543] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:24.162 [2024-10-04 08:27:16.775569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:24.162 [2024-10-04 08:27:16.775602] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:24.162 [2024-10-04 08:27:16.775620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:24.162 #54 NEW cov: 11859 ft: 14588 corp: 37/2688b lim: 100 exec/s: 54 rss: 69Mb L: 50/98 MS: 3 CopyPart-InsertByte-InsertRepeatedBytes- 00:09:24.162 [2024-10-04 08:27:16.815862] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:24.162 [2024-10-04 08:27:16.815887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:24.162 [2024-10-04 08:27:16.815922] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:24.162 [2024-10-04 08:27:16.815934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:24.162 [2024-10-04 08:27:16.815983] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:24.162 [2024-10-04 08:27:16.815997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:24.162 [2024-10-04 08:27:16.816047] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:09:24.162 [2024-10-04 08:27:16.816061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:24.421 #55 NEW cov: 11859 ft: 14599 corp: 38/2769b lim: 100 exec/s: 55 rss: 69Mb L: 81/98 MS: 1 CopyPart- 00:09:24.421 [2024-10-04 08:27:16.856029] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:24.421 [2024-10-04 08:27:16.856054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:24.421 [2024-10-04 08:27:16.856088] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:24.421 [2024-10-04 08:27:16.856101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:24.421 [2024-10-04 08:27:16.856149] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:24.421 [2024-10-04 08:27:16.856165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:24.421 [2024-10-04 08:27:16.856217] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:09:24.421 [2024-10-04 08:27:16.856231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:24.421 #56 NEW cov: 11859 ft: 14604 corp: 39/2865b lim: 100 exec/s: 28 rss: 69Mb L: 96/98 MS: 1 CrossOver- 00:09:24.421 #56 DONE cov: 11859 ft: 14604 corp: 39/2865b lim: 100 exec/s: 28 rss: 69Mb 00:09:24.421 ###### Recommended dictionary. ###### 00:09:24.421 "\004\000" # Uses: 2 00:09:24.421 ###### End of recommended dictionary. ###### 00:09:24.421 Done 56 runs in 2 second(s) 00:09:24.421 08:27:16 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_18.conf 00:09:24.421 08:27:16 -- ../common.sh@72 -- # (( i++ )) 00:09:24.421 08:27:16 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:24.421 08:27:16 -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1 00:09:24.421 08:27:16 -- nvmf/run.sh@23 -- # local fuzzer_type=19 00:09:24.421 08:27:16 -- nvmf/run.sh@24 -- # local timen=1 00:09:24.421 08:27:16 -- nvmf/run.sh@25 -- # local core=0x1 00:09:24.421 08:27:16 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:09:24.421 08:27:16 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf 00:09:24.421 08:27:16 -- nvmf/run.sh@29 -- # printf %02d 19 00:09:24.421 08:27:16 -- nvmf/run.sh@29 -- # port=4419 00:09:24.421 08:27:16 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:09:24.421 08:27:16 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' 00:09:24.421 08:27:16 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:24.421 08:27:17 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 -r /var/tmp/spdk19.sock 00:09:24.421 [2024-10-04 08:27:17.027830] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
00:09:24.421 [2024-10-04 08:27:17.027917] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1021011 ] 00:09:24.421 EAL: No free 2048 kB hugepages reported on node 1 00:09:24.680 [2024-10-04 08:27:17.206711] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:24.680 [2024-10-04 08:27:17.225793] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:24.681 [2024-10-04 08:27:17.225912] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:24.681 [2024-10-04 08:27:17.277601] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:24.681 [2024-10-04 08:27:17.293926] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 *** 00:09:24.681 INFO: Running with entropic power schedule (0xFF, 100). 00:09:24.681 INFO: Seed: 2541911574 00:09:24.681 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:09:24.681 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:09:24.681 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:09:24.681 INFO: A corpus is not provided, starting from an empty corpus 00:09:24.681 #2 INITED exec/s: 0 rss: 59Mb 00:09:24.681 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:24.681 This may also happen if the target rejected all inputs we tried so far 00:09:24.681 [2024-10-04 08:27:17.360338] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3978709506412984119 len:14136 00:09:24.681 [2024-10-04 08:27:17.360381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:24.681 [2024-10-04 08:27:17.360483] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3978709506094217015 len:14136 00:09:24.681 [2024-10-04 08:27:17.360506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:24.681 [2024-10-04 08:27:17.360623] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3978709506094217015 len:14136 00:09:24.681 [2024-10-04 08:27:17.360646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:24.681 [2024-10-04 08:27:17.360762] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:3978709506094217015 len:14136 00:09:24.681 [2024-10-04 08:27:17.360782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:25.199 NEW_FUNC[1/670]: 0x4719d8 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:09:25.199 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:25.199 #5 NEW cov: 11604 ft: 11604 corp: 2/45b lim: 50 exec/s: 0 rss: 67Mb L: 44/44 MS: 3 ChangeBit-CopyPart-InsertRepeatedBytes- 00:09:25.199 [2024-10-04 
08:27:17.671251] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3978709506412984119 len:14136 00:09:25.199 [2024-10-04 08:27:17.671299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:25.199 [2024-10-04 08:27:17.671439] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3978709506094217015 len:14136 00:09:25.199 [2024-10-04 08:27:17.671474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:25.199 [2024-10-04 08:27:17.671604] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3978709269871015735 len:1 00:09:25.199 [2024-10-04 08:27:17.671629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:25.199 [2024-10-04 08:27:17.671759] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:3978709505167851575 len:14136 00:09:25.199 [2024-10-04 08:27:17.671789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:25.199 [2024-10-04 08:27:17.671928] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:3978709506094217015 len:14136 00:09:25.199 [2024-10-04 08:27:17.671953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:09:25.199 #6 NEW cov: 11723 ft: 12239 corp: 3/95b lim: 50 exec/s: 0 rss: 67Mb L: 50/50 MS: 1 InsertRepeatedBytes- 00:09:25.199 [2024-10-04 08:27:17.731347] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3978709506412984119 len:14136 00:09:25.199 [2024-10-04 08:27:17.731384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:25.199 [2024-10-04 08:27:17.731476] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3978709506094217015 len:14136 00:09:25.199 [2024-10-04 08:27:17.731498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:25.199 [2024-10-04 08:27:17.731613] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3978709269871015735 len:1 00:09:25.199 [2024-10-04 08:27:17.731633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:25.199 [2024-10-04 08:27:17.731751] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:3978709505167851575 len:14136 00:09:25.199 [2024-10-04 08:27:17.731774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:25.199 [2024-10-04 08:27:17.731901] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:3834029160451749173 len:13621 00:09:25.199 [2024-10-04 08:27:17.731922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 
cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:09:25.199 #7 NEW cov: 11729 ft: 12472 corp: 4/145b lim: 50 exec/s: 0 rss: 67Mb L: 50/50 MS: 1 ChangeASCIIInt- 00:09:25.199 [2024-10-04 08:27:17.781136] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3978709505810249527 len:14136 00:09:25.199 [2024-10-04 08:27:17.781167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:25.199 [2024-10-04 08:27:17.781253] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3978709506094217015 len:14136 00:09:25.199 [2024-10-04 08:27:17.781274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:25.199 [2024-10-04 08:27:17.781390] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3978709506094217015 len:1 00:09:25.199 [2024-10-04 08:27:17.781413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:25.199 #12 NEW cov: 11814 ft: 13054 corp: 5/184b lim: 50 exec/s: 0 rss: 67Mb L: 39/50 MS: 5 ShuffleBytes-ChangeBit-ChangeByte-CopyPart-CrossOver- 00:09:25.199 [2024-10-04 08:27:17.831410] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3978709505810249527 len:14136 00:09:25.199 [2024-10-04 08:27:17.831441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:25.199 [2024-10-04 08:27:17.831570] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3978709506094217015 len:14136 00:09:25.199 [2024-10-04 08:27:17.831595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:25.199 [2024-10-04 08:27:17.831716] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3978709506094217015 len:1 00:09:25.199 [2024-10-04 08:27:17.831738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:25.199 [2024-10-04 08:27:17.831865] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:3978709505167851520 len:14081 00:09:25.199 [2024-10-04 08:27:17.831883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:25.199 #13 NEW cov: 11814 ft: 13154 corp: 6/228b lim: 50 exec/s: 0 rss: 67Mb L: 44/50 MS: 1 InsertRepeatedBytes- 00:09:25.199 [2024-10-04 08:27:17.871177] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3978709505810249527 len:14282 00:09:25.199 [2024-10-04 08:27:17.871212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:25.199 [2024-10-04 08:27:17.871320] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3978709508536452919 len:14136 00:09:25.199 [2024-10-04 08:27:17.871340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 
dnr:1 00:09:25.199 [2024-10-04 08:27:17.871452] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3978709506094217015 len:1 00:09:25.199 [2024-10-04 08:27:17.871473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:25.199 [2024-10-04 08:27:17.871590] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:3978709505167851520 len:14081 00:09:25.199 [2024-10-04 08:27:17.871612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:25.459 #14 NEW cov: 11814 ft: 13333 corp: 7/272b lim: 50 exec/s: 0 rss: 67Mb L: 44/50 MS: 1 ChangeBinInt- 00:09:25.459 [2024-10-04 08:27:17.910713] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3978709506412984119 len:19000 00:09:25.459 [2024-10-04 08:27:17.910743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:25.459 #15 NEW cov: 11814 ft: 13716 corp: 8/284b lim: 50 exec/s: 0 rss: 67Mb L: 12/50 MS: 1 CrossOver- 00:09:25.459 [2024-10-04 08:27:17.962034] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3978709506412984119 len:14136 00:09:25.459 [2024-10-04 08:27:17.962064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:25.459 [2024-10-04 08:27:17.962167] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3978709506094217015 len:14136 00:09:25.459 [2024-10-04 08:27:17.962192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:25.459 [2024-10-04 08:27:17.962315] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3978709269871015735 len:1 00:09:25.459 [2024-10-04 08:27:17.962339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:25.459 [2024-10-04 08:27:17.962465] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:3978709505167851575 len:14199 00:09:25.459 [2024-10-04 08:27:17.962489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:25.459 [2024-10-04 08:27:17.962623] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:3978709506094217015 len:14136 00:09:25.459 [2024-10-04 08:27:17.962648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:09:25.459 #16 NEW cov: 11814 ft: 13742 corp: 9/334b lim: 50 exec/s: 0 rss: 67Mb L: 50/50 MS: 1 ChangeByte- 00:09:25.459 [2024-10-04 08:27:18.012177] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3978709506412984119 len:14136 00:09:25.459 [2024-10-04 08:27:18.012211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:25.459 [2024-10-04 08:27:18.012304] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3978709866871469879 len:14136 00:09:25.459 [2024-10-04 08:27:18.012327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:25.459 [2024-10-04 08:27:18.012447] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3978709269871015735 len:1 00:09:25.459 [2024-10-04 08:27:18.012469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:25.459 [2024-10-04 08:27:18.012599] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:3978709505167851575 len:14136 00:09:25.459 [2024-10-04 08:27:18.012618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:25.459 [2024-10-04 08:27:18.012738] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:3834029160451749173 len:13621 00:09:25.459 [2024-10-04 08:27:18.012761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:09:25.459 #17 NEW cov: 11814 ft: 13819 corp: 10/384b lim: 50 exec/s: 0 rss: 67Mb L: 50/50 MS: 1 ChangeByte- 00:09:25.459 [2024-10-04 08:27:18.061747] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3978709505810249527 len:14155 00:09:25.459 [2024-10-04 08:27:18.061778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:25.459 [2024-10-04 08:27:18.061845] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3978709506094217015 len:14136 00:09:25.459 [2024-10-04 08:27:18.061868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:25.459 #18 NEW cov: 11814 ft: 14130 corp: 11/411b lim: 50 exec/s: 0 rss: 67Mb L: 27/50 MS: 1 CrossOver- 00:09:25.459 [2024-10-04 08:27:18.111909] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:144175898683703296 len:14136 00:09:25.459 [2024-10-04 08:27:18.111935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:25.459 #19 NEW cov: 11814 ft: 14243 corp: 12/427b lim: 50 exec/s: 0 rss: 68Mb L: 16/50 MS: 1 CMP- DE: "\000\000\002\000"- 00:09:25.720 [2024-10-04 08:27:18.162213] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3978709505810249527 len:14155 00:09:25.720 [2024-10-04 08:27:18.162246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:25.720 [2024-10-04 08:27:18.162401] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3978709506094217015 len:14136 00:09:25.720 [2024-10-04 08:27:18.162424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:25.720 #20 NEW cov: 11814 ft: 14284 corp: 13/454b lim: 50 exec/s: 0 rss: 68Mb L: 27/50 MS: 1 ChangeBit- 
00:09:25.720 [2024-10-04 08:27:18.212287] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3978709506412984119 len:14136 00:09:25.720 [2024-10-04 08:27:18.212320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:25.720 [2024-10-04 08:27:18.212466] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3978709506094217015 len:14136 00:09:25.720 [2024-10-04 08:27:18.212489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:25.720 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:25.720 #21 NEW cov: 11837 ft: 14322 corp: 14/482b lim: 50 exec/s: 0 rss: 68Mb L: 28/50 MS: 1 EraseBytes- 00:09:25.720 [2024-10-04 08:27:18.272771] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3978709505810249527 len:14155 00:09:25.720 [2024-10-04 08:27:18.272805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:25.720 [2024-10-04 08:27:18.272878] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:563187102988032 len:14136 00:09:25.720 [2024-10-04 08:27:18.272901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:25.720 [2024-10-04 08:27:18.273020] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3978709506094217015 len:14136 00:09:25.720 [2024-10-04 08:27:18.273045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:25.720 #22 NEW cov: 11837 ft: 14351 corp: 15/513b lim: 50 exec/s: 0 rss: 68Mb L: 31/50 MS: 1 PersAutoDict- DE: "\000\000\002\000"- 00:09:25.720 [2024-10-04 08:27:18.323224] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3978709506412984119 len:14136 00:09:25.720 [2024-10-04 08:27:18.323255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:25.720 [2024-10-04 08:27:18.323377] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3978709506094217015 len:14136 00:09:25.720 [2024-10-04 08:27:18.323394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:25.720 [2024-10-04 08:27:18.323511] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:14468034801405905096 len:65025 00:09:25.720 [2024-10-04 08:27:18.323534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:25.720 [2024-10-04 08:27:18.323658] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:3978709505167851575 len:14136 00:09:25.720 [2024-10-04 08:27:18.323681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:25.720 [2024-10-04 
08:27:18.323787] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:3834029160451749173 len:13621 00:09:25.720 [2024-10-04 08:27:18.323811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:09:25.720 #23 NEW cov: 11837 ft: 14372 corp: 16/563b lim: 50 exec/s: 23 rss: 68Mb L: 50/50 MS: 1 ChangeBinInt- 00:09:25.720 [2024-10-04 08:27:18.372638] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3978709505810249527 len:14155 00:09:25.720 [2024-10-04 08:27:18.372669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:25.720 #24 NEW cov: 11837 ft: 14389 corp: 17/579b lim: 50 exec/s: 24 rss: 68Mb L: 16/50 MS: 1 CrossOver- 00:09:25.980 [2024-10-04 08:27:18.422797] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3978730397133911863 len:19000 00:09:25.980 [2024-10-04 08:27:18.422824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:25.980 #25 NEW cov: 11837 ft: 14423 corp: 18/591b lim: 50 exec/s: 25 rss: 68Mb L: 12/50 MS: 1 CopyPart- 00:09:25.980 [2024-10-04 08:27:18.463233] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3978709506412984119 len:14136 00:09:25.980 [2024-10-04 08:27:18.463266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:25.980 [2024-10-04 08:27:18.463349] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3978709506094217015 len:14136 00:09:25.980 [2024-10-04 08:27:18.463371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:25.980 [2024-10-04 08:27:18.463485] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3978709506094217015 len:1 00:09:25.980 [2024-10-04 08:27:18.463506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:25.980 [2024-10-04 08:27:18.463621] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:3978709505167851575 len:14199 00:09:25.980 [2024-10-04 08:27:18.463642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:25.980 [2024-10-04 08:27:18.463756] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:3978709506094217015 len:14136 00:09:25.980 [2024-10-04 08:27:18.463777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:09:25.980 #26 NEW cov: 11837 ft: 14453 corp: 19/641b lim: 50 exec/s: 26 rss: 68Mb L: 50/50 MS: 1 CopyPart- 00:09:25.980 [2024-10-04 08:27:18.513856] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3978709506412984119 len:14136 00:09:25.980 [2024-10-04 08:27:18.513886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 
p:0 m:0 dnr:1 00:09:25.980 [2024-10-04 08:27:18.513973] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3978709506094217015 len:14136 00:09:25.980 [2024-10-04 08:27:18.513995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:25.980 [2024-10-04 08:27:18.514109] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:14468034801405905096 len:65025 00:09:25.980 [2024-10-04 08:27:18.514129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:25.980 [2024-10-04 08:27:18.514254] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:3978709505167851575 len:14136 00:09:25.980 [2024-10-04 08:27:18.514277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:25.980 [2024-10-04 08:27:18.514401] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:3834029160451749173 len:9525 00:09:25.980 [2024-10-04 08:27:18.514426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:09:25.980 #27 NEW cov: 11837 ft: 14512 corp: 20/691b lim: 50 exec/s: 27 rss: 68Mb L: 50/50 MS: 1 ChangeBit- 00:09:25.980 [2024-10-04 08:27:18.574087] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3978709506412984119 len:14136 00:09:25.981 [2024-10-04 08:27:18.574122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:25.981 [2024-10-04 08:27:18.574213] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3978709506094217015 len:14136 00:09:25.981 [2024-10-04 08:27:18.574236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:25.981 [2024-10-04 08:27:18.574355] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3978709269871015735 len:1 00:09:25.981 [2024-10-04 08:27:18.574376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:25.981 [2024-10-04 08:27:18.574494] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:3978730396815144759 len:14136 00:09:25.981 [2024-10-04 08:27:18.574514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:25.981 [2024-10-04 08:27:18.574635] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:3978709780972123959 len:14136 00:09:25.981 [2024-10-04 08:27:18.574657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:09:25.981 #28 NEW cov: 11837 ft: 14533 corp: 21/741b lim: 50 exec/s: 28 rss: 68Mb L: 50/50 MS: 1 CrossOver- 00:09:25.981 [2024-10-04 08:27:18.624192] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3978709506412984119 len:14136 
00:09:25.981 [2024-10-04 08:27:18.624243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:25.981 [2024-10-04 08:27:18.624325] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3978709506094217015 len:14136 00:09:25.981 [2024-10-04 08:27:18.624344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:25.981 [2024-10-04 08:27:18.624460] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:14468095272699873480 len:1 00:09:25.981 [2024-10-04 08:27:18.624483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:25.981 [2024-10-04 08:27:18.624605] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:3978709505167851575 len:14136 00:09:25.981 [2024-10-04 08:27:18.624623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:25.981 [2024-10-04 08:27:18.624731] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:3834029160451749173 len:13621 00:09:25.981 [2024-10-04 08:27:18.624755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:09:25.981 #29 NEW cov: 11837 ft: 14542 corp: 22/791b lim: 50 exec/s: 29 rss: 68Mb L: 50/50 MS: 1 CrossOver- 00:09:26.240 [2024-10-04 08:27:18.674395] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3978709506412984119 len:14136 00:09:26.240 [2024-10-04 08:27:18.674430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.240 [2024-10-04 08:27:18.674501] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3978709506094217015 len:14136 00:09:26.240 [2024-10-04 08:27:18.674523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:26.240 [2024-10-04 08:27:18.674637] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3978709269871031351 len:1 00:09:26.240 [2024-10-04 08:27:18.674656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:26.241 [2024-10-04 08:27:18.674768] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:3978709505167851575 len:14136 00:09:26.241 [2024-10-04 08:27:18.674789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:26.241 [2024-10-04 08:27:18.674906] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:3834029160451749173 len:13621 00:09:26.241 [2024-10-04 08:27:18.674927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:09:26.241 #30 NEW cov: 11837 ft: 14546 corp: 23/841b lim: 50 exec/s: 30 rss: 68Mb L: 50/50 MS: 1 ChangeByte- 
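The paired *NOTICE* lines that dominate this run are SPDK's own qpair tracing, not errors: for each fuzzed submission, nvme_io_qpair_print_command logs the generated command (here WRITE UNCORRECTABLE against nsid:0 with a randomized LBA and length) and spdk_nvme_print_completion logs the target's reply. The consistent INVALID NAMESPACE OR FORMAT (00/0b) status with dnr:1 (the NVMe "do not retry" bit) shows the TCP target rejecting each malformed command cleanly, which is the expected outcome; the fuzzer is looking for inputs that produce anything other than this.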
00:09:26.241 [2024-10-04 08:27:18.724355] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3978709506412984119 len:14136 00:09:26.241 [2024-10-04 08:27:18.724385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.241 [2024-10-04 08:27:18.724465] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3978709506094217015 len:14136 00:09:26.241 [2024-10-04 08:27:18.724485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:26.241 [2024-10-04 08:27:18.724594] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3978709506094217015 len:14136 00:09:26.241 [2024-10-04 08:27:18.724614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:26.241 [2024-10-04 08:27:18.724746] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:3978709506094217015 len:14136 00:09:26.241 [2024-10-04 08:27:18.724766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:26.241 #31 NEW cov: 11837 ft: 14560 corp: 24/885b lim: 50 exec/s: 31 rss: 68Mb L: 44/50 MS: 1 ShuffleBytes- 00:09:26.241 [2024-10-04 08:27:18.774336] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3978709506412984119 len:14136 00:09:26.241 [2024-10-04 08:27:18.774370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.241 [2024-10-04 08:27:18.774510] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3978709506094217015 len:14136 00:09:26.241 [2024-10-04 08:27:18.774531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:26.241 [2024-10-04 08:27:18.774645] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3978709506094217015 len:14136 00:09:26.241 [2024-10-04 08:27:18.774667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:26.241 #32 NEW cov: 11837 ft: 14573 corp: 25/920b lim: 50 exec/s: 32 rss: 68Mb L: 35/50 MS: 1 CrossOver- 00:09:26.241 [2024-10-04 08:27:18.834311] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:14468034532953606344 len:14136 00:09:26.241 [2024-10-04 08:27:18.834344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.241 [2024-10-04 08:27:18.834465] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3978709506094217015 len:14136 00:09:26.241 [2024-10-04 08:27:18.834490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:26.241 #33 NEW cov: 11837 ft: 14609 corp: 26/948b lim: 50 exec/s: 33 rss: 68Mb L: 28/50 MS: 1 ChangeBinInt- 00:09:26.241 [2024-10-04 08:27:18.884334] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:563187421780480 len:14136 00:09:26.241 [2024-10-04 08:27:18.884366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.241 #34 NEW cov: 11837 ft: 14622 corp: 27/965b lim: 50 exec/s: 34 rss: 68Mb L: 17/50 MS: 1 InsertByte- 00:09:26.500 [2024-10-04 08:27:18.935052] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3978709505810249527 len:14282 00:09:26.500 [2024-10-04 08:27:18.935086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.500 [2024-10-04 08:27:18.935191] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3978709508536452918 len:14136 00:09:26.500 [2024-10-04 08:27:18.935211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:26.500 [2024-10-04 08:27:18.935324] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3978709506094217015 len:1 00:09:26.500 [2024-10-04 08:27:18.935344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:26.500 [2024-10-04 08:27:18.935456] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:3978709505167851520 len:14081 00:09:26.500 [2024-10-04 08:27:18.935476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:26.500 #35 NEW cov: 11837 ft: 14636 corp: 28/1009b lim: 50 exec/s: 35 rss: 68Mb L: 44/50 MS: 1 ChangeBit- 00:09:26.500 [2024-10-04 08:27:18.994985] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3978709505818441527 len:14136 00:09:26.500 [2024-10-04 08:27:18.995019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.500 [2024-10-04 08:27:18.995103] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3978709269871015735 len:1 00:09:26.501 [2024-10-04 08:27:18.995126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:26.501 [2024-10-04 08:27:18.995279] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3978709505167851575 len:1 00:09:26.501 [2024-10-04 08:27:18.995300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:26.501 #36 NEW cov: 11837 ft: 14657 corp: 29/1042b lim: 50 exec/s: 36 rss: 68Mb L: 33/50 MS: 1 EraseBytes- 00:09:26.501 [2024-10-04 08:27:19.045637] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3978709506412984119 len:14136 00:09:26.501 [2024-10-04 08:27:19.045671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.501 [2024-10-04 08:27:19.045738] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 
lba:3978709506094217015 len:14136 00:09:26.501 [2024-10-04 08:27:19.045759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:26.501 [2024-10-04 08:27:19.045873] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3978709269871015735 len:1 00:09:26.501 [2024-10-04 08:27:19.045899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:26.501 [2024-10-04 08:27:19.046011] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:3978709587698595639 len:14136 00:09:26.501 [2024-10-04 08:27:19.046033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:26.501 [2024-10-04 08:27:19.046149] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:3978709780972123959 len:14136 00:09:26.501 [2024-10-04 08:27:19.046173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:09:26.501 #37 NEW cov: 11837 ft: 14687 corp: 30/1092b lim: 50 exec/s: 37 rss: 68Mb L: 50/50 MS: 1 ShuffleBytes- 00:09:26.501 [2024-10-04 08:27:19.105157] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3978709506412984119 len:14136 00:09:26.501 [2024-10-04 08:27:19.105193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.501 [2024-10-04 08:27:19.105314] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3978709506412984119 len:19000 00:09:26.501 [2024-10-04 08:27:19.105336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:26.501 #38 NEW cov: 11837 ft: 14701 corp: 31/1114b lim: 50 exec/s: 38 rss: 69Mb L: 22/50 MS: 1 CopyPart- 00:09:26.501 [2024-10-04 08:27:19.155308] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3978709505810249527 len:14155 00:09:26.501 [2024-10-04 08:27:19.155339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.501 [2024-10-04 08:27:19.155466] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3978709506094216999 len:14136 00:09:26.501 [2024-10-04 08:27:19.155488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:26.501 #39 NEW cov: 11837 ft: 14709 corp: 32/1141b lim: 50 exec/s: 39 rss: 69Mb L: 27/50 MS: 1 ChangeBit- 00:09:26.761 [2024-10-04 08:27:19.205361] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3973924431808902967 len:19000 00:09:26.761 [2024-10-04 08:27:19.205391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.761 #45 NEW cov: 11837 ft: 14725 corp: 33/1153b lim: 50 exec/s: 45 rss: 69Mb L: 12/50 MS: 1 ChangeByte- 00:09:26.761 [2024-10-04 08:27:19.245752] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3978709506412984119 len:14136 00:09:26.761 [2024-10-04 08:27:19.245782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.761 [2024-10-04 08:27:19.245845] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3978709506094217015 len:14136 00:09:26.761 [2024-10-04 08:27:19.245864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:26.761 [2024-10-04 08:27:19.245996] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:14468095272699873480 len:1 00:09:26.761 [2024-10-04 08:27:19.246017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:26.761 [2024-10-04 08:27:19.246139] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:3963228381452384000 len:14136 00:09:26.761 [2024-10-04 08:27:19.246165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:26.761 [2024-10-04 08:27:19.246290] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:3834029160451749173 len:13621 00:09:26.761 [2024-10-04 08:27:19.246312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:09:26.761 #46 NEW cov: 11837 ft: 14760 corp: 34/1203b lim: 50 exec/s: 46 rss: 69Mb L: 50/50 MS: 1 ShuffleBytes- 00:09:26.761 [2024-10-04 08:27:19.295912] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4022497556394620727 len:14136 00:09:26.761 [2024-10-04 08:27:19.295942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.761 [2024-10-04 08:27:19.296050] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3978709269871015735 len:1 00:09:26.761 [2024-10-04 08:27:19.296078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:26.761 [2024-10-04 08:27:19.296204] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3978709505167851575 len:1 00:09:26.761 [2024-10-04 08:27:19.296225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:26.761 #47 NEW cov: 11837 ft: 14766 corp: 35/1236b lim: 50 exec/s: 47 rss: 69Mb L: 33/50 MS: 1 ChangeBinInt- 00:09:26.761 [2024-10-04 08:27:19.346093] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:144175898683703296 len:14136 00:09:26.761 [2024-10-04 08:27:19.346125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.761 [2024-10-04 08:27:19.346213] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:09:26.761 [2024-10-04 08:27:19.346232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:26.761 [2024-10-04 08:27:19.346345] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:09:26.761 [2024-10-04 08:27:19.346370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:26.761 #48 NEW cov: 11837 ft: 14778 corp: 36/1272b lim: 50 exec/s: 24 rss: 69Mb L: 36/50 MS: 1 InsertRepeatedBytes- 00:09:26.761 #48 DONE cov: 11837 ft: 14778 corp: 36/1272b lim: 50 exec/s: 24 rss: 69Mb 00:09:26.761 ###### Recommended dictionary. ###### 00:09:26.761 "\000\000\002\000" # Uses: 1 00:09:26.761 ###### End of recommended dictionary. ###### 00:09:26.761 Done 48 runs in 2 second(s) 00:09:27.021 08:27:19 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_19.conf 00:09:27.021 08:27:19 -- ../common.sh@72 -- # (( i++ )) 00:09:27.021 08:27:19 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:27.021 08:27:19 -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:09:27.021 08:27:19 -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:09:27.021 08:27:19 -- nvmf/run.sh@24 -- # local timen=1 00:09:27.021 08:27:19 -- nvmf/run.sh@25 -- # local core=0x1 00:09:27.021 08:27:19 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:09:27.021 08:27:19 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:09:27.021 08:27:19 -- nvmf/run.sh@29 -- # printf %02d 20 00:09:27.021 08:27:19 -- nvmf/run.sh@29 -- # port=4420 00:09:27.021 08:27:19 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:09:27.021 08:27:19 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:09:27.021 08:27:19 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:27.021 08:27:19 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 -r /var/tmp/spdk20.sock 00:09:27.021 [2024-10-04 08:27:19.522889] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
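Pieced together from the nvmf/run.sh trace above, the launch sequence for fuzzer 20 amounts to the short script below. This is a sketch reconstructed from the trace, not the script itself: the workspace root is abbreviated to $rootdir, and the redirect of the sed output into the generated config is an assumption (the trace records the sed command but not its redirection).

#!/usr/bin/env bash
# Reconstructed from the nvmf/run.sh trace; $rootdir stands in for
# /var/jenkins/workspace/short-fuzz-phy-autotest/spdk.
fuzzer_type=20        # -Z: which fuzz target to exercise
timen=1               # -t: run time in seconds
core=0x1              # -m: reactor core mask
corpus_dir="$rootdir/../corpus/llvm_nvmf_${fuzzer_type}"
nvmf_cfg="/tmp/fuzz_json_${fuzzer_type}.conf"

# Each fuzzer listens on its own TCP port: "44" plus the zero-padded index,
# so fuzzer 20 gets 4420 (which is why the sed below is a no-op for this run).
port="44$(printf %02d $fuzzer_type)"
mkdir -p "$corpus_dir"

trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"

# Assumed redirect: the trace shows the substitution but not where it lands.
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
  "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"

"$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
  -m "$core" -s 512 -P "$rootdir/../output/llvm/" \
  -F "$trid" -c "$nvmf_cfg" -t "$timen" \
  -D "$corpus_dir" -Z "$fuzzer_type" -r "/var/tmp/spdk${fuzzer_type}.sock"

The 'rm -rf /tmp/fuzz_json_19.conf' at run.sh@46 above is the matching cleanup of the previous fuzzer's config, and the per-index port and RPC socket (/var/tmp/spdk20.sock) presumably keep concurrent fuzzer instances from colliding with one another.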
00:09:27.021 [2024-10-04 08:27:19.522973] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1021554 ] 00:09:27.021 EAL: No free 2048 kB hugepages reported on node 1 00:09:27.021 [2024-10-04 08:27:19.700259] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:27.281 [2024-10-04 08:27:19.719633] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:27.281 [2024-10-04 08:27:19.719751] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:27.281 [2024-10-04 08:27:19.771036] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:27.281 [2024-10-04 08:27:19.787341] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:09:27.281 INFO: Running with entropic power schedule (0xFF, 100). 00:09:27.281 INFO: Seed: 738961455 00:09:27.281 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:09:27.281 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:09:27.281 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:09:27.281 INFO: A corpus is not provided, starting from an empty corpus 00:09:27.281 #2 INITED exec/s: 0 rss: 59Mb 00:09:27.281 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:27.281 This may also happen if the target rejected all inputs we tried so far 00:09:27.281 [2024-10-04 08:27:19.832079] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:27.281 [2024-10-04 08:27:19.832113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.281 [2024-10-04 08:27:19.832147] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:27.281 [2024-10-04 08:27:19.832164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:27.281 [2024-10-04 08:27:19.832199] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:09:27.281 [2024-10-04 08:27:19.832216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:27.281 [2024-10-04 08:27:19.832261] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:09:27.281 [2024-10-04 08:27:19.832277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:27.540 NEW_FUNC[1/672]: 0x473598 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:09:27.540 NEW_FUNC[2/672]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:27.540 #5 NEW cov: 11663 ft: 11660 corp: 2/75b lim: 90 exec/s: 0 rss: 67Mb L: 74/74 MS: 3 ChangeBit-ShuffleBytes-InsertRepeatedBytes- 00:09:27.540 [2024-10-04 08:27:20.152920] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 
00:09:27.540 [2024-10-04 08:27:20.152968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.540 [2024-10-04 08:27:20.153007] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:27.540 [2024-10-04 08:27:20.153027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:27.540 [2024-10-04 08:27:20.153062] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:09:27.540 [2024-10-04 08:27:20.153081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:27.541 [2024-10-04 08:27:20.153110] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:09:27.541 [2024-10-04 08:27:20.153127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:27.541 #6 NEW cov: 11781 ft: 12225 corp: 3/149b lim: 90 exec/s: 0 rss: 67Mb L: 74/74 MS: 1 CrossOver- 00:09:27.800 [2024-10-04 08:27:20.223000] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:27.800 [2024-10-04 08:27:20.223035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.800 [2024-10-04 08:27:20.223070] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:27.800 [2024-10-04 08:27:20.223087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:27.800 [2024-10-04 08:27:20.223118] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:09:27.801 [2024-10-04 08:27:20.223136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:27.801 [2024-10-04 08:27:20.223166] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:09:27.801 [2024-10-04 08:27:20.223182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:27.801 #7 NEW cov: 11787 ft: 12472 corp: 4/223b lim: 90 exec/s: 0 rss: 67Mb L: 74/74 MS: 1 ChangeBinInt- 00:09:27.801 [2024-10-04 08:27:20.273083] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:27.801 [2024-10-04 08:27:20.273118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.801 [2024-10-04 08:27:20.273153] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:27.801 [2024-10-04 08:27:20.273172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:27.801 [2024-10-04 08:27:20.273211] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:09:27.801 [2024-10-04 08:27:20.273229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:27.801 [2024-10-04 08:27:20.273259] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:09:27.801 [2024-10-04 08:27:20.273276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:27.801 #8 NEW cov: 11872 ft: 12656 corp: 5/298b lim: 90 exec/s: 0 rss: 67Mb L: 75/75 MS: 1 InsertByte- 00:09:27.801 [2024-10-04 08:27:20.343203] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:27.801 [2024-10-04 08:27:20.343235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.801 [2024-10-04 08:27:20.343267] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:27.801 [2024-10-04 08:27:20.343283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:27.801 [2024-10-04 08:27:20.343311] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:09:27.801 [2024-10-04 08:27:20.343327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:27.801 [2024-10-04 08:27:20.343360] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:09:27.801 [2024-10-04 08:27:20.343376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:27.801 #9 NEW cov: 11872 ft: 12842 corp: 6/372b lim: 90 exec/s: 0 rss: 67Mb L: 74/75 MS: 1 ChangeByte- 00:09:27.801 [2024-10-04 08:27:20.403265] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:27.801 [2024-10-04 08:27:20.403297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.801 [2024-10-04 08:27:20.403330] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:27.801 [2024-10-04 08:27:20.403347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:27.801 #11 NEW cov: 11872 ft: 13504 corp: 7/422b lim: 90 exec/s: 0 rss: 67Mb L: 50/75 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:09:27.801 [2024-10-04 08:27:20.463476] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:27.801 [2024-10-04 08:27:20.463507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.801 [2024-10-04 08:27:20.463540] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:27.801 [2024-10-04 08:27:20.463557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:27.801 [2024-10-04 08:27:20.463585] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:09:27.801 [2024-10-04 08:27:20.463601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:28.060 #17 NEW cov: 11872 ft: 13882 corp: 8/492b lim: 90 exec/s: 0 rss: 67Mb L: 70/75 MS: 1 InsertRepeatedBytes- 00:09:28.060 [2024-10-04 08:27:20.533541] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:28.060 [2024-10-04 08:27:20.533572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:28.060 #19 NEW cov: 11872 ft: 14744 corp: 9/512b lim: 90 exec/s: 0 rss: 67Mb L: 20/75 MS: 2 ChangeBit-InsertRepeatedBytes- 00:09:28.060 [2024-10-04 08:27:20.593856] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:28.060 [2024-10-04 08:27:20.593887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:28.060 [2024-10-04 08:27:20.593919] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:28.060 [2024-10-04 08:27:20.593936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:28.060 [2024-10-04 08:27:20.593965] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:09:28.060 [2024-10-04 08:27:20.593981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:28.060 [2024-10-04 08:27:20.594008] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:09:28.060 [2024-10-04 08:27:20.594024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:28.060 #20 NEW cov: 11872 ft: 14822 corp: 10/586b lim: 90 exec/s: 0 rss: 67Mb L: 74/75 MS: 1 CopyPart- 00:09:28.060 [2024-10-04 08:27:20.643942] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:28.060 [2024-10-04 08:27:20.643973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:28.060 [2024-10-04 08:27:20.644009] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:28.060 [2024-10-04 08:27:20.644026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:28.060 [2024-10-04 08:27:20.644054] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:09:28.060 [2024-10-04 08:27:20.644070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:28.060 #21 NEW cov: 11872 ft: 14892 corp: 11/653b lim: 90 exec/s: 0 rss: 67Mb L: 67/75 MS: 1 EraseBytes- 00:09:28.060 [2024-10-04 08:27:20.704106] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:28.060 [2024-10-04 08:27:20.704138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:28.060 [2024-10-04 08:27:20.704173] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 
nsid:0 00:09:28.060 [2024-10-04 08:27:20.704199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:28.320 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:28.320 #22 NEW cov: 11889 ft: 14926 corp: 12/699b lim: 90 exec/s: 0 rss: 68Mb L: 46/75 MS: 1 EraseBytes- 00:09:28.320 [2024-10-04 08:27:20.774304] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:28.320 [2024-10-04 08:27:20.774335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:28.320 [2024-10-04 08:27:20.774368] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:28.320 [2024-10-04 08:27:20.774385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:28.320 [2024-10-04 08:27:20.774414] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:09:28.320 [2024-10-04 08:27:20.774431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:28.320 #23 NEW cov: 11889 ft: 14950 corp: 13/758b lim: 90 exec/s: 23 rss: 68Mb L: 59/75 MS: 1 CrossOver- 00:09:28.320 [2024-10-04 08:27:20.834544] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:28.320 [2024-10-04 08:27:20.834576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:28.320 [2024-10-04 08:27:20.834608] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:28.320 [2024-10-04 08:27:20.834625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:28.320 [2024-10-04 08:27:20.834655] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:09:28.320 [2024-10-04 08:27:20.834671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:28.320 [2024-10-04 08:27:20.834699] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:09:28.320 [2024-10-04 08:27:20.834715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:28.320 #24 NEW cov: 11889 ft: 14989 corp: 14/832b lim: 90 exec/s: 24 rss: 68Mb L: 74/75 MS: 1 CopyPart- 00:09:28.320 [2024-10-04 08:27:20.884647] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:28.320 [2024-10-04 08:27:20.884678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:28.320 [2024-10-04 08:27:20.884713] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:28.320 [2024-10-04 08:27:20.884730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:28.320 
[2024-10-04 08:27:20.884759] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:09:28.320 [2024-10-04 08:27:20.884775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:28.320 [2024-10-04 08:27:20.884802] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:09:28.320 [2024-10-04 08:27:20.884817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:28.320 #25 NEW cov: 11889 ft: 15057 corp: 15/918b lim: 90 exec/s: 25 rss: 68Mb L: 86/86 MS: 1 CopyPart- 00:09:28.320 [2024-10-04 08:27:20.934654] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:28.320 [2024-10-04 08:27:20.934685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:28.320 [2024-10-04 08:27:20.934719] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:28.320 [2024-10-04 08:27:20.934735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:28.320 #26 NEW cov: 11889 ft: 15086 corp: 16/968b lim: 90 exec/s: 26 rss: 68Mb L: 50/86 MS: 1 ChangeByte- 00:09:28.320 [2024-10-04 08:27:20.984888] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:28.320 [2024-10-04 08:27:20.984918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:28.320 [2024-10-04 08:27:20.984950] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:28.320 [2024-10-04 08:27:20.984966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:28.320 [2024-10-04 08:27:20.984995] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:09:28.320 [2024-10-04 08:27:20.985011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:28.320 [2024-10-04 08:27:20.985038] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:09:28.320 [2024-10-04 08:27:20.985054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:28.580 #27 NEW cov: 11889 ft: 15134 corp: 17/1042b lim: 90 exec/s: 27 rss: 68Mb L: 74/86 MS: 1 EraseBytes- 00:09:28.580 [2024-10-04 08:27:21.035028] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:28.580 [2024-10-04 08:27:21.035057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:28.580 [2024-10-04 08:27:21.035089] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:28.580 [2024-10-04 08:27:21.035106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:28.580 
[2024-10-04 08:27:21.035134] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:09:28.580 [2024-10-04 08:27:21.035150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:28.580 [2024-10-04 08:27:21.035178] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:09:28.580 [2024-10-04 08:27:21.035200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:28.580 #28 NEW cov: 11889 ft: 15156 corp: 18/1116b lim: 90 exec/s: 28 rss: 68Mb L: 74/86 MS: 1 CrossOver- 00:09:28.580 [2024-10-04 08:27:21.095158] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:28.580 [2024-10-04 08:27:21.095195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:28.580 [2024-10-04 08:27:21.095229] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:28.580 [2024-10-04 08:27:21.095246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:28.580 [2024-10-04 08:27:21.095275] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:09:28.580 [2024-10-04 08:27:21.095291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:28.580 #29 NEW cov: 11889 ft: 15195 corp: 19/1186b lim: 90 exec/s: 29 rss: 68Mb L: 70/86 MS: 1 ChangeByte- 00:09:28.580 [2024-10-04 08:27:21.155242] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:28.580 [2024-10-04 08:27:21.155273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:28.580 #32 NEW cov: 11889 ft: 15254 corp: 20/1221b lim: 90 exec/s: 32 rss: 68Mb L: 35/86 MS: 3 CrossOver-CrossOver-InsertRepeatedBytes- 00:09:28.580 [2024-10-04 08:27:21.225553] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:28.580 [2024-10-04 08:27:21.225585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:28.580 [2024-10-04 08:27:21.225619] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:28.580 [2024-10-04 08:27:21.225636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:28.580 [2024-10-04 08:27:21.225667] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:09:28.580 [2024-10-04 08:27:21.225683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:28.580 #33 NEW cov: 11889 ft: 15266 corp: 21/1276b lim: 90 exec/s: 33 rss: 68Mb L: 55/86 MS: 1 CopyPart- 00:09:28.838 [2024-10-04 08:27:21.275504] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:28.838 [2024-10-04 08:27:21.275533] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:28.838 #34 NEW cov: 11889 ft: 15280 corp: 22/1294b lim: 90 exec/s: 34 rss: 68Mb L: 18/86 MS: 1 CrossOver- 00:09:28.838 [2024-10-04 08:27:21.345893] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:28.838 [2024-10-04 08:27:21.345923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:28.838 [2024-10-04 08:27:21.345955] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:28.838 [2024-10-04 08:27:21.345971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:28.838 [2024-10-04 08:27:21.345999] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:09:28.838 [2024-10-04 08:27:21.346015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:28.838 [2024-10-04 08:27:21.346043] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:09:28.838 [2024-10-04 08:27:21.346058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:28.838 #35 NEW cov: 11889 ft: 15290 corp: 23/1368b lim: 90 exec/s: 35 rss: 68Mb L: 74/86 MS: 1 InsertRepeatedBytes- 00:09:28.838 [2024-10-04 08:27:21.405989] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:28.838 [2024-10-04 08:27:21.406019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:28.838 [2024-10-04 08:27:21.406051] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:28.838 [2024-10-04 08:27:21.406068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:28.839 [2024-10-04 08:27:21.406096] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:09:28.839 [2024-10-04 08:27:21.406112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:28.839 #36 NEW cov: 11889 ft: 15308 corp: 24/1423b lim: 90 exec/s: 36 rss: 68Mb L: 55/86 MS: 1 ChangeBinInt- 00:09:28.839 [2024-10-04 08:27:21.466233] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:28.839 [2024-10-04 08:27:21.466264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:28.839 [2024-10-04 08:27:21.466297] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:28.839 [2024-10-04 08:27:21.466314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:28.839 [2024-10-04 08:27:21.466345] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:09:28.839 [2024-10-04 08:27:21.466362] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:28.839 [2024-10-04 08:27:21.466390] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:09:28.839 [2024-10-04 08:27:21.466407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:28.839 #37 NEW cov: 11889 ft: 15363 corp: 25/1511b lim: 90 exec/s: 37 rss: 68Mb L: 88/88 MS: 1 InsertRepeatedBytes- 00:09:28.839 [2024-10-04 08:27:21.516299] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:28.839 [2024-10-04 08:27:21.516331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:28.839 [2024-10-04 08:27:21.516365] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:28.839 [2024-10-04 08:27:21.516383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:28.839 [2024-10-04 08:27:21.516414] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:09:28.839 [2024-10-04 08:27:21.516431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:29.097 #38 NEW cov: 11889 ft: 15387 corp: 26/1578b lim: 90 exec/s: 38 rss: 69Mb L: 67/88 MS: 1 ChangeBinInt- 00:09:29.097 [2024-10-04 08:27:21.586523] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:29.097 [2024-10-04 08:27:21.586553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:29.097 [2024-10-04 08:27:21.586585] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:29.097 [2024-10-04 08:27:21.586601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:29.097 [2024-10-04 08:27:21.586629] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:09:29.097 [2024-10-04 08:27:21.586652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:29.097 [2024-10-04 08:27:21.586680] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:09:29.097 [2024-10-04 08:27:21.586695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:29.097 #39 NEW cov: 11889 ft: 15404 corp: 27/1666b lim: 90 exec/s: 39 rss: 69Mb L: 88/88 MS: 1 ShuffleBytes- 00:09:29.097 [2024-10-04 08:27:21.646646] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:29.097 [2024-10-04 08:27:21.646675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:29.097 [2024-10-04 08:27:21.646707] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:29.097 [2024-10-04 
08:27:21.646723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:29.097 [2024-10-04 08:27:21.646752] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:09:29.097 [2024-10-04 08:27:21.646767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:29.097 [2024-10-04 08:27:21.646795] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:09:29.097 [2024-10-04 08:27:21.646810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:29.097 #40 NEW cov: 11889 ft: 15428 corp: 28/1740b lim: 90 exec/s: 40 rss: 69Mb L: 74/88 MS: 1 ShuffleBytes- 00:09:29.097 [2024-10-04 08:27:21.696772] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:29.097 [2024-10-04 08:27:21.696803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:29.097 [2024-10-04 08:27:21.696834] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:29.097 [2024-10-04 08:27:21.696851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:29.097 [2024-10-04 08:27:21.696880] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:09:29.097 [2024-10-04 08:27:21.696895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:29.097 [2024-10-04 08:27:21.696923] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:09:29.097 [2024-10-04 08:27:21.696938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:29.097 #41 NEW cov: 11896 ft: 15449 corp: 29/1814b lim: 90 exec/s: 41 rss: 69Mb L: 74/88 MS: 1 ChangeBinInt- 00:09:29.097 [2024-10-04 08:27:21.746902] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:29.097 [2024-10-04 08:27:21.746934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:29.097 [2024-10-04 08:27:21.746967] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:29.097 [2024-10-04 08:27:21.746987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:29.097 [2024-10-04 08:27:21.747018] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:09:29.097 [2024-10-04 08:27:21.747036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:29.097 [2024-10-04 08:27:21.747068] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:09:29.097 [2024-10-04 08:27:21.747085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:29.357 #42 NEW cov: 11896 ft: 15473 corp: 30/1895b lim: 90 exec/s: 42 rss: 69Mb L: 81/88 MS: 1 InsertRepeatedBytes- 00:09:29.357 [2024-10-04 08:27:21.797033] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:29.357 [2024-10-04 08:27:21.797064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:29.357 [2024-10-04 08:27:21.797095] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:29.357 [2024-10-04 08:27:21.797111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:29.357 [2024-10-04 08:27:21.797140] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:09:29.357 [2024-10-04 08:27:21.797156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:29.357 [2024-10-04 08:27:21.797183] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:09:29.357 [2024-10-04 08:27:21.797206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:29.357 #43 NEW cov: 11896 ft: 15495 corp: 31/1974b lim: 90 exec/s: 21 rss: 69Mb L: 79/88 MS: 1 CrossOver- 00:09:29.357 #43 DONE cov: 11896 ft: 15495 corp: 31/1974b lim: 90 exec/s: 21 rss: 69Mb 00:09:29.357 Done 43 runs in 2 second(s) 00:09:29.357 08:27:21 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_20.conf 00:09:29.357 08:27:21 -- ../common.sh@72 -- # (( i++ )) 00:09:29.357 08:27:21 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:29.357 08:27:21 -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:09:29.357 08:27:21 -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:09:29.357 08:27:21 -- nvmf/run.sh@24 -- # local timen=1 00:09:29.357 08:27:21 -- nvmf/run.sh@25 -- # local core=0x1 00:09:29.357 08:27:21 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:09:29.357 08:27:21 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:09:29.357 08:27:21 -- nvmf/run.sh@29 -- # printf %02d 21 00:09:29.357 08:27:21 -- nvmf/run.sh@29 -- # port=4421 00:09:29.357 08:27:21 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:09:29.357 08:27:21 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:09:29.357 08:27:21 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:29.357 08:27:21 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 -r /var/tmp/spdk21.sock 00:09:29.357 [2024-10-04 08:27:21.985892] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
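The nvmf/run.sh trace above amounts to a per-fuzzer launch sequence. A minimal bash sketch, reconstructed from the traced commands (here $rootdir stands for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk; the port derivation and the redirect of the sed output into the config file are inferred rather than shown verbatim in the trace):

    # start_llvm_fuzz <fuzzer_type> <timen> <core>, as traced for fuzzer 21
    fuzzer_type=21 timen=1 core=0x1
    nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf
    corpus_dir=$rootdir/../corpus/llvm_nvmf_$(printf %02d "$fuzzer_type")
    port="44$(printf %02d "$fuzzer_type")"   # inferred: 4421 for fuzzer 21, 4422 for 22
    mkdir -p "$corpus_dir"
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
    # template the base config onto this fuzzer's TCP port (output redirect assumed)
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
    "$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m "$core" -s 512 \
        -P "$rootdir/../output/llvm/" -F "$trid" -c "$nvmf_cfg" -t "$timen" \
        -D "$corpus_dir" -Z "$fuzzer_type" -r "/var/tmp/spdk${fuzzer_type}.sock"

The -t 1 argument appears to bound the fuzzing time, which matches the "Done N runs in 2 second(s)" summaries: each fuzzer gets roughly one second of fuzzing plus target setup and teardown.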
00:09:29.357 [2024-10-04 08:27:21.985963] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1021958 ] 00:09:29.357 EAL: No free 2048 kB hugepages reported on node 1 00:09:29.615 [2024-10-04 08:27:22.173925] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:29.615 [2024-10-04 08:27:22.192796] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:29.615 [2024-10-04 08:27:22.192917] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:29.615 [2024-10-04 08:27:22.244553] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:29.615 [2024-10-04 08:27:22.260918] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:09:29.616 INFO: Running with entropic power schedule (0xFF, 100). 00:09:29.616 INFO: Seed: 3213937839 00:09:29.616 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:09:29.616 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:09:29.616 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:09:29.616 INFO: A corpus is not provided, starting from an empty corpus 00:09:29.616 #2 INITED exec/s: 0 rss: 59Mb 00:09:29.616 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:29.616 This may also happen if the target rejected all inputs we tried so far 00:09:29.874 [2024-10-04 08:27:22.305442] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:29.874 [2024-10-04 08:27:22.305478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.133 NEW_FUNC[1/672]: 0x4767c8 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:09:30.133 NEW_FUNC[2/672]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:30.133 #4 NEW cov: 11643 ft: 11644 corp: 2/11b lim: 50 exec/s: 0 rss: 67Mb L: 10/10 MS: 2 InsertByte-CMP- DE: "\000h\340\213dST."- 00:09:30.133 [2024-10-04 08:27:22.626346] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:30.133 [2024-10-04 08:27:22.626383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.133 [2024-10-04 08:27:22.626417] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:30.133 [2024-10-04 08:27:22.626434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:30.133 [2024-10-04 08:27:22.626463] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:09:30.133 [2024-10-04 08:27:22.626479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:30.133 [2024-10-04 08:27:22.626506] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 
00:09:30.133 [2024-10-04 08:27:22.626522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:30.133 #5 NEW cov: 11756 ft: 13079 corp: 3/55b lim: 50 exec/s: 0 rss: 68Mb L: 44/44 MS: 1 InsertRepeatedBytes- 00:09:30.133 [2024-10-04 08:27:22.686229] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:30.133 [2024-10-04 08:27:22.686260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.133 #11 NEW cov: 11762 ft: 13420 corp: 4/65b lim: 50 exec/s: 0 rss: 68Mb L: 10/44 MS: 1 ShuffleBytes- 00:09:30.133 [2024-10-04 08:27:22.746472] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:30.134 [2024-10-04 08:27:22.746503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.134 [2024-10-04 08:27:22.746536] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:30.134 [2024-10-04 08:27:22.746553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:30.134 #12 NEW cov: 11847 ft: 13938 corp: 5/85b lim: 50 exec/s: 0 rss: 68Mb L: 20/44 MS: 1 CrossOver- 00:09:30.393 [2024-10-04 08:27:22.816772] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:30.393 [2024-10-04 08:27:22.816804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.393 [2024-10-04 08:27:22.816843] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:30.393 [2024-10-04 08:27:22.816861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:30.393 [2024-10-04 08:27:22.816892] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:09:30.393 [2024-10-04 08:27:22.816910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:30.393 [2024-10-04 08:27:22.816939] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:09:30.393 [2024-10-04 08:27:22.816956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:30.393 #13 NEW cov: 11847 ft: 14098 corp: 6/129b lim: 50 exec/s: 0 rss: 68Mb L: 44/44 MS: 1 PersAutoDict- DE: "\000h\340\213dST."- 00:09:30.393 [2024-10-04 08:27:22.886723] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:30.393 [2024-10-04 08:27:22.886753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.393 #14 NEW cov: 11847 ft: 14157 corp: 7/139b lim: 50 exec/s: 0 rss: 68Mb L: 10/44 MS: 1 ShuffleBytes- 00:09:30.393 [2024-10-04 08:27:22.936861] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:30.393 [2024-10-04 08:27:22.936892] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.393 #15 NEW cov: 11847 ft: 14306 corp: 8/149b lim: 50 exec/s: 0 rss: 68Mb L: 10/44 MS: 1 ChangeByte- 00:09:30.393 [2024-10-04 08:27:22.997175] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:30.393 [2024-10-04 08:27:22.997213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.393 [2024-10-04 08:27:22.997245] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:30.393 [2024-10-04 08:27:22.997262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:30.393 [2024-10-04 08:27:22.997291] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:09:30.393 [2024-10-04 08:27:22.997307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:30.393 [2024-10-04 08:27:22.997334] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:09:30.393 [2024-10-04 08:27:22.997350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:30.393 #16 NEW cov: 11847 ft: 14378 corp: 9/193b lim: 50 exec/s: 0 rss: 68Mb L: 44/44 MS: 1 ShuffleBytes- 00:09:30.393 [2024-10-04 08:27:23.067369] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:30.393 [2024-10-04 08:27:23.067400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.393 [2024-10-04 08:27:23.067433] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:30.393 [2024-10-04 08:27:23.067449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:30.393 [2024-10-04 08:27:23.067479] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:09:30.393 [2024-10-04 08:27:23.067496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:30.393 [2024-10-04 08:27:23.067524] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:09:30.393 [2024-10-04 08:27:23.067544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:30.653 #22 NEW cov: 11847 ft: 14473 corp: 10/237b lim: 50 exec/s: 0 rss: 68Mb L: 44/44 MS: 1 ChangeByte- 00:09:30.653 [2024-10-04 08:27:23.117420] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:30.653 [2024-10-04 08:27:23.117452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.653 [2024-10-04 08:27:23.117486] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:30.653 [2024-10-04 08:27:23.117503] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:30.653 #23 NEW cov: 11847 ft: 14507 corp: 11/257b lim: 50 exec/s: 0 rss: 68Mb L: 20/44 MS: 1 CopyPart- 00:09:30.653 [2024-10-04 08:27:23.177682] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:30.653 [2024-10-04 08:27:23.177713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.653 [2024-10-04 08:27:23.177744] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:30.653 [2024-10-04 08:27:23.177761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:30.653 [2024-10-04 08:27:23.177790] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:09:30.653 [2024-10-04 08:27:23.177806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:30.653 [2024-10-04 08:27:23.177833] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:09:30.653 [2024-10-04 08:27:23.177849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:30.653 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:30.653 #24 NEW cov: 11870 ft: 14633 corp: 12/301b lim: 50 exec/s: 0 rss: 68Mb L: 44/44 MS: 1 ChangeBit- 00:09:30.653 [2024-10-04 08:27:23.247789] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:30.653 [2024-10-04 08:27:23.247822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.653 [2024-10-04 08:27:23.247858] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:30.653 [2024-10-04 08:27:23.247877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:30.653 #30 NEW cov: 11870 ft: 14653 corp: 13/329b lim: 50 exec/s: 30 rss: 68Mb L: 28/44 MS: 1 PersAutoDict- DE: "\000h\340\213dST."- 00:09:30.653 [2024-10-04 08:27:23.318113] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:30.654 [2024-10-04 08:27:23.318146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.654 [2024-10-04 08:27:23.318180] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:30.654 [2024-10-04 08:27:23.318206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:30.654 [2024-10-04 08:27:23.318239] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:09:30.654 [2024-10-04 08:27:23.318256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:30.654 [2024-10-04 08:27:23.318286] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:09:30.654 [2024-10-04 08:27:23.318306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:30.913 #31 NEW cov: 11870 ft: 14665 corp: 14/373b lim: 50 exec/s: 31 rss: 68Mb L: 44/44 MS: 1 ChangeBinInt- 00:09:30.913 [2024-10-04 08:27:23.368148] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:30.913 [2024-10-04 08:27:23.368178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.913 [2024-10-04 08:27:23.368218] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:30.913 [2024-10-04 08:27:23.368235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:30.913 [2024-10-04 08:27:23.368264] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:09:30.913 [2024-10-04 08:27:23.368281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:30.913 [2024-10-04 08:27:23.368308] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:09:30.913 [2024-10-04 08:27:23.368323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:30.913 #32 NEW cov: 11870 ft: 14681 corp: 15/420b lim: 50 exec/s: 32 rss: 69Mb L: 47/47 MS: 1 CopyPart- 00:09:30.913 [2024-10-04 08:27:23.438272] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:30.913 [2024-10-04 08:27:23.438303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.913 [2024-10-04 08:27:23.438336] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:30.913 [2024-10-04 08:27:23.438353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:30.913 #33 NEW cov: 11870 ft: 14763 corp: 16/446b lim: 50 exec/s: 33 rss: 69Mb L: 26/47 MS: 1 EraseBytes- 00:09:30.913 [2024-10-04 08:27:23.508553] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:30.913 [2024-10-04 08:27:23.508584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.913 [2024-10-04 08:27:23.508616] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:30.913 [2024-10-04 08:27:23.508633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:30.913 [2024-10-04 08:27:23.508663] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:09:30.913 [2024-10-04 08:27:23.508678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:30.913 [2024-10-04 08:27:23.508705] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:09:30.913 [2024-10-04 08:27:23.508720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:30.913 #34 NEW cov: 11870 ft: 14787 corp: 17/490b lim: 50 exec/s: 34 rss: 69Mb L: 44/47 MS: 1 ChangeByte- 00:09:30.913 [2024-10-04 08:27:23.558694] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:30.913 [2024-10-04 08:27:23.558726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.913 [2024-10-04 08:27:23.558757] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:30.913 [2024-10-04 08:27:23.558774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:30.913 [2024-10-04 08:27:23.558807] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:09:30.913 [2024-10-04 08:27:23.558822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:30.913 [2024-10-04 08:27:23.558850] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:09:30.913 [2024-10-04 08:27:23.558865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:31.331 #35 NEW cov: 11870 ft: 14795 corp: 18/534b lim: 50 exec/s: 35 rss: 69Mb L: 44/47 MS: 1 ChangeBinInt- 00:09:31.331 [2024-10-04 08:27:23.628886] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:31.331 [2024-10-04 08:27:23.628916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:31.331 [2024-10-04 08:27:23.628947] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:31.331 [2024-10-04 08:27:23.628964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:31.331 [2024-10-04 08:27:23.628992] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:09:31.331 [2024-10-04 08:27:23.629008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:31.331 [2024-10-04 08:27:23.629035] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:09:31.331 [2024-10-04 08:27:23.629051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:31.331 #36 NEW cov: 11870 ft: 14824 corp: 19/581b lim: 50 exec/s: 36 rss: 69Mb L: 47/47 MS: 1 CrossOver- 00:09:31.331 [2024-10-04 08:27:23.699146] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:31.331 [2024-10-04 08:27:23.699176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:31.331 [2024-10-04 08:27:23.699215] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:31.331 [2024-10-04 08:27:23.699232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:31.331 [2024-10-04 08:27:23.699260] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:09:31.331 [2024-10-04 08:27:23.699276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:31.331 [2024-10-04 08:27:23.699303] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:09:31.331 [2024-10-04 08:27:23.699318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:31.331 #37 NEW cov: 11870 ft: 14840 corp: 20/625b lim: 50 exec/s: 37 rss: 69Mb L: 44/47 MS: 1 ChangeByte- 00:09:31.331 [2024-10-04 08:27:23.769299] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:31.331 [2024-10-04 08:27:23.769331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:31.331 [2024-10-04 08:27:23.769365] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:31.331 [2024-10-04 08:27:23.769383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:31.331 [2024-10-04 08:27:23.769413] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:09:31.331 [2024-10-04 08:27:23.769434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:31.331 #38 NEW cov: 11870 ft: 15089 corp: 21/656b lim: 50 exec/s: 38 rss: 69Mb L: 31/47 MS: 1 InsertRepeatedBytes- 00:09:31.331 [2024-10-04 08:27:23.839466] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:31.331 [2024-10-04 08:27:23.839497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:31.331 [2024-10-04 08:27:23.839528] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:31.331 [2024-10-04 08:27:23.839544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:31.331 [2024-10-04 08:27:23.839573] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:09:31.331 [2024-10-04 08:27:23.839589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:31.331 #39 NEW cov: 11870 ft: 15131 corp: 22/690b lim: 50 exec/s: 39 rss: 69Mb L: 34/47 MS: 1 CopyPart- 00:09:31.331 [2024-10-04 08:27:23.909582] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:31.331 [2024-10-04 08:27:23.909615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:31.331 #40 NEW cov: 11870 ft: 15136 corp: 23/700b 
lim: 50 exec/s: 40 rss: 69Mb L: 10/47 MS: 1 CopyPart- 00:09:31.331 [2024-10-04 08:27:23.959885] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:31.331 [2024-10-04 08:27:23.959920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:31.331 [2024-10-04 08:27:23.959955] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:31.331 [2024-10-04 08:27:23.959973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:31.331 [2024-10-04 08:27:23.960004] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:09:31.332 [2024-10-04 08:27:23.960021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:31.332 [2024-10-04 08:27:23.960052] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:09:31.332 [2024-10-04 08:27:23.960069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:31.332 #41 NEW cov: 11870 ft: 15141 corp: 24/744b lim: 50 exec/s: 41 rss: 69Mb L: 44/47 MS: 1 ChangeByte- 00:09:31.332 [2024-10-04 08:27:24.009954] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:31.332 [2024-10-04 08:27:24.009986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:31.332 [2024-10-04 08:27:24.010017] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:31.332 [2024-10-04 08:27:24.010034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:31.332 [2024-10-04 08:27:24.010062] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:09:31.332 [2024-10-04 08:27:24.010078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:31.332 [2024-10-04 08:27:24.010105] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:09:31.332 [2024-10-04 08:27:24.010121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:31.592 #42 NEW cov: 11870 ft: 15167 corp: 25/788b lim: 50 exec/s: 42 rss: 69Mb L: 44/47 MS: 1 ChangeBit- 00:09:31.592 [2024-10-04 08:27:24.080088] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:31.592 [2024-10-04 08:27:24.080120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:31.592 [2024-10-04 08:27:24.080153] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:31.592 [2024-10-04 08:27:24.080170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:31.592 [2024-10-04 08:27:24.080206] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:09:31.592 [2024-10-04 08:27:24.080239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:31.592 #43 NEW cov: 11870 ft: 15183 corp: 26/822b lim: 50 exec/s: 43 rss: 69Mb L: 34/47 MS: 1 ShuffleBytes- 00:09:31.592 [2024-10-04 08:27:24.150174] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:31.592 [2024-10-04 08:27:24.150213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:31.592 #44 NEW cov: 11870 ft: 15226 corp: 27/833b lim: 50 exec/s: 44 rss: 69Mb L: 11/47 MS: 1 InsertByte- 00:09:31.592 [2024-10-04 08:27:24.210432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:31.592 [2024-10-04 08:27:24.210463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:31.592 [2024-10-04 08:27:24.210499] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:31.592 [2024-10-04 08:27:24.210517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:31.592 #45 NEW cov: 11870 ft: 15239 corp: 28/859b lim: 50 exec/s: 45 rss: 70Mb L: 26/47 MS: 1 CopyPart- 00:09:31.852 [2024-10-04 08:27:24.280544] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:31.852 [2024-10-04 08:27:24.280576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:31.852 #46 NEW cov: 11870 ft: 15245 corp: 29/871b lim: 50 exec/s: 23 rss: 70Mb L: 12/47 MS: 1 EraseBytes- 00:09:31.852 #46 DONE cov: 11870 ft: 15245 corp: 29/871b lim: 50 exec/s: 23 rss: 70Mb 00:09:31.852 ###### Recommended dictionary. ###### 00:09:31.852 "\000h\340\213dST." # Uses: 2 00:09:31.852 ###### End of recommended dictionary. 
###### 00:09:31.852 Done 46 runs in 2 second(s) 00:09:31.852 08:27:24 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_21.conf 00:09:31.852 08:27:24 -- ../common.sh@72 -- # (( i++ )) 00:09:31.852 08:27:24 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:31.852 08:27:24 -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1 00:09:31.852 08:27:24 -- nvmf/run.sh@23 -- # local fuzzer_type=22 00:09:31.852 08:27:24 -- nvmf/run.sh@24 -- # local timen=1 00:09:31.852 08:27:24 -- nvmf/run.sh@25 -- # local core=0x1 00:09:31.852 08:27:24 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:09:31.852 08:27:24 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf 00:09:31.852 08:27:24 -- nvmf/run.sh@29 -- # printf %02d 22 00:09:31.852 08:27:24 -- nvmf/run.sh@29 -- # port=4422 00:09:31.852 08:27:24 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:09:31.852 08:27:24 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' 00:09:31.852 08:27:24 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:31.852 08:27:24 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 -r /var/tmp/spdk22.sock 00:09:31.852 [2024-10-04 08:27:24.471649] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:09:31.852 [2024-10-04 08:27:24.471717] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1022396 ] 00:09:31.852 EAL: No free 2048 kB hugepages reported on node 1 00:09:32.110 [2024-10-04 08:27:24.645632] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:32.110 [2024-10-04 08:27:24.664729] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:32.110 [2024-10-04 08:27:24.664849] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:32.110 [2024-10-04 08:27:24.716083] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:32.110 [2024-10-04 08:27:24.732450] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 *** 00:09:32.110 INFO: Running with entropic power schedule (0xFF, 100). 00:09:32.110 INFO: Seed: 1391442661 00:09:32.110 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:09:32.110 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:09:32.110 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:09:32.110 INFO: A corpus is not provided, starting from an empty corpus 00:09:32.110 #2 INITED exec/s: 0 rss: 59Mb 00:09:32.110 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
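The ../common.sh@72-73 entries that bracket each run (here and before run 21 above) come from the driver loop stepping through the fuzzer numbers. A minimal sketch of that loop, inferred from the traced "(( i++ ))" / "(( i < fuzz_num ))" pair (the loop bounds and surrounding script are assumptions; only the counter increment, the counter test, and the start_llvm_fuzz call appear in the trace):

    # Inferred driver loop: run each nvmf fuzzer once, 1 second each, on core 0x1.
    fuzz_num=25               # hypothetical bound; the real value is not visible in this log
    for (( i = 0; i < fuzz_num; i++ )); do
        start_llvm_fuzz "$i" 1 0x1    # fuzzer_type, timen, core, as traced
    done

Each run removes its own JSON config on the way out (the rm -rf /tmp/fuzz_json_NN.conf traced at nvmf/run.sh@46) before the loop advances, so the per-fuzzer config files in /tmp never accumulate.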
00:09:32.110 This may also happen if the target rejected all inputs we tried so far 00:09:32.369 [2024-10-04 08:27:24.797669] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:32.369 [2024-10-04 08:27:24.797700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:32.369 [2024-10-04 08:27:24.797761] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:32.369 [2024-10-04 08:27:24.797779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:32.628 NEW_FUNC[1/672]: 0x478a98 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:09:32.628 NEW_FUNC[2/672]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:32.628 #4 NEW cov: 11669 ft: 11670 corp: 2/48b lim: 85 exec/s: 0 rss: 66Mb L: 47/47 MS: 2 CopyPart-InsertRepeatedBytes- 00:09:32.628 [2024-10-04 08:27:25.108692] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:32.628 [2024-10-04 08:27:25.108735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:32.628 [2024-10-04 08:27:25.108801] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:32.628 [2024-10-04 08:27:25.108823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:32.628 [2024-10-04 08:27:25.108886] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:32.628 [2024-10-04 08:27:25.108908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:32.628 [2024-10-04 08:27:25.108971] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:09:32.628 [2024-10-04 08:27:25.108992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:32.628 #7 NEW cov: 11782 ft: 12638 corp: 3/130b lim: 85 exec/s: 0 rss: 66Mb L: 82/82 MS: 3 ChangeByte-InsertByte-InsertRepeatedBytes- 00:09:32.628 [2024-10-04 08:27:25.148543] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:32.628 [2024-10-04 08:27:25.148571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:32.628 [2024-10-04 08:27:25.148606] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:32.628 [2024-10-04 08:27:25.148621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:32.628 [2024-10-04 08:27:25.148674] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:32.628 [2024-10-04 08:27:25.148689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:32.628 #8 NEW cov: 11788 ft: 13128 corp: 4/195b lim: 85 exec/s: 0 rss: 67Mb L: 65/82 MS: 1 CopyPart- 00:09:32.629 [2024-10-04 08:27:25.188536] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:32.629 [2024-10-04 08:27:25.188567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:32.629 [2024-10-04 08:27:25.188619] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:32.629 [2024-10-04 08:27:25.188634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:32.629 #9 NEW cov: 11873 ft: 13436 corp: 5/242b lim: 85 exec/s: 0 rss: 67Mb L: 47/82 MS: 1 CopyPart- 00:09:32.629 [2024-10-04 08:27:25.228890] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:32.629 [2024-10-04 08:27:25.228919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:32.629 [2024-10-04 08:27:25.228956] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:32.629 [2024-10-04 08:27:25.228971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:32.629 [2024-10-04 08:27:25.229021] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:32.629 [2024-10-04 08:27:25.229036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:32.629 [2024-10-04 08:27:25.229087] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:09:32.629 [2024-10-04 08:27:25.229102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:32.629 #10 NEW cov: 11873 ft: 13539 corp: 6/324b lim: 85 exec/s: 0 rss: 67Mb L: 82/82 MS: 1 ChangeBinInt- 00:09:32.629 [2024-10-04 08:27:25.268745] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:32.629 [2024-10-04 08:27:25.268773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:32.629 [2024-10-04 08:27:25.268823] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:32.629 [2024-10-04 08:27:25.268838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:32.629 #11 NEW cov: 11873 ft: 13623 corp: 7/371b lim: 85 exec/s: 0 rss: 67Mb L: 47/82 MS: 1 CopyPart- 00:09:32.629 [2024-10-04 08:27:25.309219] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:32.629 [2024-10-04 08:27:25.309248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:32.629 [2024-10-04 08:27:25.309291] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:32.629 [2024-10-04 
08:27:25.309310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:32.629 [2024-10-04 08:27:25.309364] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:32.629 [2024-10-04 08:27:25.309380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:32.629 [2024-10-04 08:27:25.309432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:09:32.629 [2024-10-04 08:27:25.309447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:32.888 #12 NEW cov: 11873 ft: 13682 corp: 8/454b lim: 85 exec/s: 0 rss: 67Mb L: 83/83 MS: 1 InsertByte- 00:09:32.888 [2024-10-04 08:27:25.349133] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:32.888 [2024-10-04 08:27:25.349163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:32.888 [2024-10-04 08:27:25.349198] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:32.888 [2024-10-04 08:27:25.349215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:32.888 [2024-10-04 08:27:25.349267] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:32.888 [2024-10-04 08:27:25.349283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:32.888 #14 NEW cov: 11873 ft: 13776 corp: 9/518b lim: 85 exec/s: 0 rss: 67Mb L: 64/83 MS: 2 ShuffleBytes-CrossOver- 00:09:32.888 [2024-10-04 08:27:25.389377] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:32.888 [2024-10-04 08:27:25.389406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:32.888 [2024-10-04 08:27:25.389445] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:32.888 [2024-10-04 08:27:25.389460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:32.888 [2024-10-04 08:27:25.389512] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:32.888 [2024-10-04 08:27:25.389528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:32.888 [2024-10-04 08:27:25.389578] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:09:32.888 [2024-10-04 08:27:25.389593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:32.888 #20 NEW cov: 11873 ft: 13799 corp: 10/600b lim: 85 exec/s: 0 rss: 67Mb L: 82/83 MS: 1 ChangeBinInt- 00:09:32.888 [2024-10-04 08:27:25.429509] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:32.888 
[2024-10-04 08:27:25.429536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:32.888 [2024-10-04 08:27:25.429572] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:32.888 [2024-10-04 08:27:25.429585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:32.888 [2024-10-04 08:27:25.429637] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:32.888 [2024-10-04 08:27:25.429652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:32.888 [2024-10-04 08:27:25.429702] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:09:32.888 [2024-10-04 08:27:25.429720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:32.888 #21 NEW cov: 11873 ft: 13861 corp: 11/682b lim: 85 exec/s: 0 rss: 67Mb L: 82/83 MS: 1 ChangeBit- 00:09:32.888 [2024-10-04 08:27:25.469630] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:32.888 [2024-10-04 08:27:25.469658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:32.888 [2024-10-04 08:27:25.469699] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:32.888 [2024-10-04 08:27:25.469715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:32.888 [2024-10-04 08:27:25.469767] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:32.888 [2024-10-04 08:27:25.469782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:32.888 [2024-10-04 08:27:25.469832] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:09:32.888 [2024-10-04 08:27:25.469847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:32.888 #22 NEW cov: 11873 ft: 13914 corp: 12/765b lim: 85 exec/s: 0 rss: 67Mb L: 83/83 MS: 1 ChangeByte- 00:09:32.888 [2024-10-04 08:27:25.509451] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:32.888 [2024-10-04 08:27:25.509478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:32.888 [2024-10-04 08:27:25.509511] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:32.888 [2024-10-04 08:27:25.509526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:32.888 #23 NEW cov: 11873 ft: 13943 corp: 13/814b lim: 85 exec/s: 0 rss: 67Mb L: 49/83 MS: 1 EraseBytes- 00:09:32.888 [2024-10-04 08:27:25.549461] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:32.888 
[2024-10-04 08:27:25.549489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:33.147 #24 NEW cov: 11873 ft: 14746 corp: 14/847b lim: 85 exec/s: 0 rss: 67Mb L: 33/83 MS: 1 EraseBytes- 00:09:33.147 [2024-10-04 08:27:25.599880] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:33.147 [2024-10-04 08:27:25.599906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:33.147 [2024-10-04 08:27:25.599944] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:33.147 [2024-10-04 08:27:25.599958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:33.147 [2024-10-04 08:27:25.600008] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:33.147 [2024-10-04 08:27:25.600024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:33.147 #25 NEW cov: 11873 ft: 14773 corp: 15/912b lim: 85 exec/s: 0 rss: 67Mb L: 65/83 MS: 1 InsertByte- 00:09:33.147 [2024-10-04 08:27:25.639683] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:33.147 [2024-10-04 08:27:25.639710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:33.147 #26 NEW cov: 11873 ft: 14814 corp: 16/938b lim: 85 exec/s: 0 rss: 67Mb L: 26/83 MS: 1 EraseBytes- 00:09:33.147 [2024-10-04 08:27:25.680284] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:33.147 [2024-10-04 08:27:25.680311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:33.147 [2024-10-04 08:27:25.680351] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:33.147 [2024-10-04 08:27:25.680366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:33.147 [2024-10-04 08:27:25.680416] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:33.147 [2024-10-04 08:27:25.680432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:33.147 [2024-10-04 08:27:25.680483] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:09:33.147 [2024-10-04 08:27:25.680498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:33.147 #27 NEW cov: 11873 ft: 14833 corp: 17/1021b lim: 85 exec/s: 0 rss: 67Mb L: 83/83 MS: 1 CopyPart- 00:09:33.147 [2024-10-04 08:27:25.720375] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:33.147 [2024-10-04 08:27:25.720402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:33.147 [2024-10-04 08:27:25.720442] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:33.147 [2024-10-04 08:27:25.720455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:33.147 [2024-10-04 08:27:25.720508] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:33.147 [2024-10-04 08:27:25.720524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:33.147 [2024-10-04 08:27:25.720577] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:09:33.147 [2024-10-04 08:27:25.720592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:33.147 #28 NEW cov: 11873 ft: 14851 corp: 18/1103b lim: 85 exec/s: 0 rss: 67Mb L: 82/83 MS: 1 ShuffleBytes- 00:09:33.147 [2024-10-04 08:27:25.760314] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:33.147 [2024-10-04 08:27:25.760341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:33.147 [2024-10-04 08:27:25.760377] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:33.147 [2024-10-04 08:27:25.760392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:33.147 [2024-10-04 08:27:25.760444] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:33.147 [2024-10-04 08:27:25.760460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:33.147 #29 NEW cov: 11873 ft: 14880 corp: 19/1170b lim: 85 exec/s: 29 rss: 67Mb L: 67/83 MS: 1 InsertRepeatedBytes- 00:09:33.147 [2024-10-04 08:27:25.800423] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:33.147 [2024-10-04 08:27:25.800449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:33.147 [2024-10-04 08:27:25.800484] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:33.147 [2024-10-04 08:27:25.800499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:33.147 [2024-10-04 08:27:25.800555] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:33.147 [2024-10-04 08:27:25.800570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:33.147 #30 NEW cov: 11873 ft: 14908 corp: 20/1235b lim: 85 exec/s: 30 rss: 67Mb L: 65/83 MS: 1 InsertRepeatedBytes- 00:09:33.407 [2024-10-04 08:27:25.840676] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:33.407 [2024-10-04 08:27:25.840704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:33.407 
[2024-10-04 08:27:25.840750] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:33.407 [2024-10-04 08:27:25.840765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:33.407 [2024-10-04 08:27:25.840815] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:33.407 [2024-10-04 08:27:25.840831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:33.407 [2024-10-04 08:27:25.840880] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:09:33.407 [2024-10-04 08:27:25.840894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:33.407 #31 NEW cov: 11873 ft: 14921 corp: 21/1317b lim: 85 exec/s: 31 rss: 67Mb L: 82/83 MS: 1 EraseBytes- 00:09:33.407 [2024-10-04 08:27:25.880831] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:33.407 [2024-10-04 08:27:25.880858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:33.407 [2024-10-04 08:27:25.880894] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:33.407 [2024-10-04 08:27:25.880910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:33.407 [2024-10-04 08:27:25.880958] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:33.407 [2024-10-04 08:27:25.880974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:33.407 [2024-10-04 08:27:25.881027] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:09:33.407 [2024-10-04 08:27:25.881042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:33.407 #32 NEW cov: 11873 ft: 14949 corp: 22/1400b lim: 85 exec/s: 32 rss: 68Mb L: 83/83 MS: 1 ShuffleBytes- 00:09:33.407 [2024-10-04 08:27:25.920669] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:33.407 [2024-10-04 08:27:25.920696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:33.407 [2024-10-04 08:27:25.920730] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:33.407 [2024-10-04 08:27:25.920744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:33.407 #33 NEW cov: 11873 ft: 15001 corp: 23/1436b lim: 85 exec/s: 33 rss: 68Mb L: 36/83 MS: 1 EraseBytes- 00:09:33.407 [2024-10-04 08:27:25.960631] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:33.407 [2024-10-04 08:27:25.960657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
00:09:33.407 #34 NEW cov: 11873 ft: 15073 corp: 24/1463b lim: 85 exec/s: 34 rss: 68Mb L: 27/83 MS: 1 EraseBytes- 00:09:33.407 [2024-10-04 08:27:26.000869] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:33.407 [2024-10-04 08:27:26.000896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:33.407 [2024-10-04 08:27:26.000933] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:33.407 [2024-10-04 08:27:26.000948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:33.407 #35 NEW cov: 11873 ft: 15108 corp: 25/1510b lim: 85 exec/s: 35 rss: 68Mb L: 47/83 MS: 1 ChangeBit- 00:09:33.407 [2024-10-04 08:27:26.041145] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:33.407 [2024-10-04 08:27:26.041172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:33.407 [2024-10-04 08:27:26.041212] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:33.407 [2024-10-04 08:27:26.041228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:33.407 [2024-10-04 08:27:26.041281] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:33.407 [2024-10-04 08:27:26.041296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:33.407 #36 NEW cov: 11873 ft: 15133 corp: 26/1574b lim: 85 exec/s: 36 rss: 68Mb L: 64/83 MS: 1 CrossOver- 00:09:33.407 [2024-10-04 08:27:26.081401] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:33.407 [2024-10-04 08:27:26.081427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:33.407 [2024-10-04 08:27:26.081464] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:33.407 [2024-10-04 08:27:26.081479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:33.407 [2024-10-04 08:27:26.081530] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:33.407 [2024-10-04 08:27:26.081545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:33.407 [2024-10-04 08:27:26.081598] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:09:33.407 [2024-10-04 08:27:26.081612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:33.666 #37 NEW cov: 11873 ft: 15156 corp: 27/1656b lim: 85 exec/s: 37 rss: 68Mb L: 82/83 MS: 1 ShuffleBytes- 00:09:33.666 [2024-10-04 08:27:26.121543] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:33.666 [2024-10-04 08:27:26.121570] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:33.666 [2024-10-04 08:27:26.121616] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:33.666 [2024-10-04 08:27:26.121630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:33.666 [2024-10-04 08:27:26.121677] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:33.666 [2024-10-04 08:27:26.121692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:33.666 [2024-10-04 08:27:26.121743] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:09:33.666 [2024-10-04 08:27:26.121758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:33.666 #38 NEW cov: 11873 ft: 15234 corp: 28/1740b lim: 85 exec/s: 38 rss: 68Mb L: 84/84 MS: 1 InsertByte- 00:09:33.666 [2024-10-04 08:27:26.161351] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:33.666 [2024-10-04 08:27:26.161378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:33.666 [2024-10-04 08:27:26.161412] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:33.666 [2024-10-04 08:27:26.161427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:33.666 #39 NEW cov: 11873 ft: 15331 corp: 29/1787b lim: 85 exec/s: 39 rss: 68Mb L: 47/84 MS: 1 CrossOver- 00:09:33.666 [2024-10-04 08:27:26.201591] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:33.666 [2024-10-04 08:27:26.201618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:33.666 [2024-10-04 08:27:26.201655] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:33.666 [2024-10-04 08:27:26.201670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:33.666 [2024-10-04 08:27:26.201720] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:33.666 [2024-10-04 08:27:26.201736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:33.666 #40 NEW cov: 11873 ft: 15346 corp: 30/1851b lim: 85 exec/s: 40 rss: 68Mb L: 64/84 MS: 1 ChangeBit- 00:09:33.666 [2024-10-04 08:27:26.241878] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:33.666 [2024-10-04 08:27:26.241905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:33.666 [2024-10-04 08:27:26.241953] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:33.666 [2024-10-04 08:27:26.241967] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:33.666 [2024-10-04 08:27:26.242016] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:33.666 [2024-10-04 08:27:26.242031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:33.666 [2024-10-04 08:27:26.242081] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:09:33.666 [2024-10-04 08:27:26.242096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:33.666 #41 NEW cov: 11873 ft: 15358 corp: 31/1934b lim: 85 exec/s: 41 rss: 68Mb L: 83/84 MS: 1 ShuffleBytes- 00:09:33.666 [2024-10-04 08:27:26.281854] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:33.666 [2024-10-04 08:27:26.281881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:33.666 [2024-10-04 08:27:26.281921] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:33.666 [2024-10-04 08:27:26.281936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:33.666 [2024-10-04 08:27:26.281985] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:33.666 [2024-10-04 08:27:26.282000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:33.666 #42 NEW cov: 11873 ft: 15367 corp: 32/1997b lim: 85 exec/s: 42 rss: 68Mb L: 63/84 MS: 1 InsertRepeatedBytes- 00:09:33.666 [2024-10-04 08:27:26.321975] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:33.666 [2024-10-04 08:27:26.322003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:33.666 [2024-10-04 08:27:26.322039] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:33.666 [2024-10-04 08:27:26.322053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:33.666 [2024-10-04 08:27:26.322103] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:33.666 [2024-10-04 08:27:26.322119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:33.925 #43 NEW cov: 11873 ft: 15383 corp: 33/2062b lim: 85 exec/s: 43 rss: 68Mb L: 65/84 MS: 1 ShuffleBytes- 00:09:33.925 [2024-10-04 08:27:26.362224] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:33.925 [2024-10-04 08:27:26.362253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:33.926 [2024-10-04 08:27:26.362294] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:33.926 [2024-10-04 
08:27:26.362311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:33.926 [2024-10-04 08:27:26.362360] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:33.926 [2024-10-04 08:27:26.362378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:33.926 [2024-10-04 08:27:26.362430] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:09:33.926 [2024-10-04 08:27:26.362445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:33.926 #44 NEW cov: 11873 ft: 15397 corp: 34/2145b lim: 85 exec/s: 44 rss: 68Mb L: 83/84 MS: 1 ChangeBit- 00:09:33.926 [2024-10-04 08:27:26.402301] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:33.926 [2024-10-04 08:27:26.402329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:33.926 [2024-10-04 08:27:26.402375] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:33.926 [2024-10-04 08:27:26.402390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:33.926 [2024-10-04 08:27:26.402437] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:33.926 [2024-10-04 08:27:26.402453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:33.926 [2024-10-04 08:27:26.402503] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:09:33.926 [2024-10-04 08:27:26.402517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:33.926 #45 NEW cov: 11873 ft: 15405 corp: 35/2228b lim: 85 exec/s: 45 rss: 68Mb L: 83/84 MS: 1 ChangeBit- 00:09:33.926 [2024-10-04 08:27:26.442315] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:33.926 [2024-10-04 08:27:26.442343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:33.926 [2024-10-04 08:27:26.442379] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:33.926 [2024-10-04 08:27:26.442394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:33.926 [2024-10-04 08:27:26.442449] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:33.926 [2024-10-04 08:27:26.442465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:33.926 #46 NEW cov: 11873 ft: 15426 corp: 36/2295b lim: 85 exec/s: 46 rss: 68Mb L: 67/84 MS: 1 InsertRepeatedBytes- 00:09:33.926 [2024-10-04 08:27:26.482608] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:33.926 
[2024-10-04 08:27:26.482637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:33.926 [2024-10-04 08:27:26.482675] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:33.926 [2024-10-04 08:27:26.482690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:33.926 [2024-10-04 08:27:26.482741] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:33.926 [2024-10-04 08:27:26.482756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:33.926 [2024-10-04 08:27:26.482808] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:09:33.926 [2024-10-04 08:27:26.482825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:33.926 #47 NEW cov: 11873 ft: 15428 corp: 37/2377b lim: 85 exec/s: 47 rss: 68Mb L: 82/84 MS: 1 ChangeBit- 00:09:33.926 [2024-10-04 08:27:26.522654] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:33.926 [2024-10-04 08:27:26.522682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:33.926 [2024-10-04 08:27:26.522730] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:33.926 [2024-10-04 08:27:26.522744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:33.926 [2024-10-04 08:27:26.522796] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:33.926 [2024-10-04 08:27:26.522811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:33.926 [2024-10-04 08:27:26.522863] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:09:33.926 [2024-10-04 08:27:26.522877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:33.926 #48 NEW cov: 11873 ft: 15448 corp: 38/2461b lim: 85 exec/s: 48 rss: 68Mb L: 84/84 MS: 1 InsertByte- 00:09:33.926 [2024-10-04 08:27:26.562517] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:33.926 [2024-10-04 08:27:26.562546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:33.926 [2024-10-04 08:27:26.562596] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:33.926 [2024-10-04 08:27:26.562612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:33.926 #51 NEW cov: 11873 ft: 15453 corp: 39/2506b lim: 85 exec/s: 51 rss: 68Mb L: 45/84 MS: 3 ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:09:33.926 [2024-10-04 08:27:26.592874] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER 
(0d) sqid:1 cid:0 nsid:0 00:09:33.926 [2024-10-04 08:27:26.592902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:33.926 [2024-10-04 08:27:26.592937] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:33.926 [2024-10-04 08:27:26.592953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:33.926 [2024-10-04 08:27:26.593004] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:33.926 [2024-10-04 08:27:26.593019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:33.926 [2024-10-04 08:27:26.593068] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:09:33.926 [2024-10-04 08:27:26.593083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:34.185 #52 NEW cov: 11873 ft: 15473 corp: 40/2590b lim: 85 exec/s: 52 rss: 68Mb L: 84/84 MS: 1 InsertByte- 00:09:34.185 [2024-10-04 08:27:26.632576] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:34.185 [2024-10-04 08:27:26.632603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:34.185 #53 NEW cov: 11873 ft: 15479 corp: 41/2617b lim: 85 exec/s: 53 rss: 68Mb L: 27/84 MS: 1 EraseBytes- 00:09:34.185 [2024-10-04 08:27:26.672852] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:34.185 [2024-10-04 08:27:26.672879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:34.185 [2024-10-04 08:27:26.672918] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:34.185 [2024-10-04 08:27:26.672933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:34.185 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:34.185 #54 NEW cov: 11896 ft: 15522 corp: 42/2660b lim: 85 exec/s: 54 rss: 68Mb L: 43/84 MS: 1 EraseBytes- 00:09:34.185 [2024-10-04 08:27:26.723248] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:34.185 [2024-10-04 08:27:26.723276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:34.185 [2024-10-04 08:27:26.723317] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:34.185 [2024-10-04 08:27:26.723332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:34.185 [2024-10-04 08:27:26.723382] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:34.185 [2024-10-04 08:27:26.723397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 
p:0 m:0 dnr:1 00:09:34.185 [2024-10-04 08:27:26.723448] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:09:34.185 [2024-10-04 08:27:26.723463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:34.185 #55 NEW cov: 11896 ft: 15597 corp: 43/2729b lim: 85 exec/s: 55 rss: 69Mb L: 69/84 MS: 1 InsertRepeatedBytes- 00:09:34.185 [2024-10-04 08:27:26.763251] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:34.185 [2024-10-04 08:27:26.763278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:34.185 [2024-10-04 08:27:26.763321] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:34.185 [2024-10-04 08:27:26.763335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:34.185 [2024-10-04 08:27:26.763389] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:34.185 [2024-10-04 08:27:26.763403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:34.185 #56 NEW cov: 11896 ft: 15631 corp: 44/2780b lim: 85 exec/s: 28 rss: 69Mb L: 51/84 MS: 1 CrossOver- 00:09:34.185 #56 DONE cov: 11896 ft: 15631 corp: 44/2780b lim: 85 exec/s: 28 rss: 69Mb 00:09:34.185 Done 56 runs in 2 second(s) 00:09:34.444 08:27:26 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_22.conf 00:09:34.444 08:27:26 -- ../common.sh@72 -- # (( i++ )) 00:09:34.444 08:27:26 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:34.444 08:27:26 -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1 00:09:34.444 08:27:26 -- nvmf/run.sh@23 -- # local fuzzer_type=23 00:09:34.444 08:27:26 -- nvmf/run.sh@24 -- # local timen=1 00:09:34.444 08:27:26 -- nvmf/run.sh@25 -- # local core=0x1 00:09:34.444 08:27:26 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:09:34.444 08:27:26 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf 00:09:34.444 08:27:26 -- nvmf/run.sh@29 -- # printf %02d 23 00:09:34.444 08:27:26 -- nvmf/run.sh@29 -- # port=4423 00:09:34.444 08:27:26 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:09:34.444 08:27:26 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' 00:09:34.444 08:27:26 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:34.444 08:27:26 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 -r /var/tmp/spdk23.sock 00:09:34.444 [2024-10-04 08:27:26.936400] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
00:09:34.444 [2024-10-04 08:27:26.936467] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1022930 ] 00:09:34.444 EAL: No free 2048 kB hugepages reported on node 1 00:09:34.444 [2024-10-04 08:27:27.109177] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:34.703 [2024-10-04 08:27:27.128319] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:34.703 [2024-10-04 08:27:27.128446] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:34.703 [2024-10-04 08:27:27.179732] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:34.703 [2024-10-04 08:27:27.196088] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 *** 00:09:34.703 INFO: Running with entropic power schedule (0xFF, 100). 00:09:34.703 INFO: Seed: 3853984123 00:09:34.703 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:09:34.703 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:09:34.703 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:09:34.703 INFO: A corpus is not provided, starting from an empty corpus 00:09:34.703 #2 INITED exec/s: 0 rss: 60Mb 00:09:34.703 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:34.703 This may also happen if the target rejected all inputs we tried so far 00:09:34.703 [2024-10-04 08:27:27.251076] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:34.703 [2024-10-04 08:27:27.251109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:34.962 NEW_FUNC[1/671]: 0x47bcd8 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:09:34.962 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:34.962 #6 NEW cov: 11602 ft: 11603 corp: 2/6b lim: 25 exec/s: 0 rss: 67Mb L: 5/5 MS: 4 CopyPart-CopyPart-ShuffleBytes-InsertRepeatedBytes- 00:09:34.962 [2024-10-04 08:27:27.541825] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:34.962 [2024-10-04 08:27:27.541867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:34.962 #7 NEW cov: 11715 ft: 12168 corp: 3/15b lim: 25 exec/s: 0 rss: 67Mb L: 9/9 MS: 1 InsertRepeatedBytes- 00:09:34.962 [2024-10-04 08:27:27.582093] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:34.962 [2024-10-04 08:27:27.582121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:34.962 [2024-10-04 08:27:27.582159] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:34.962 [2024-10-04 08:27:27.582175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:34.962 [2024-10-04 
08:27:27.582232] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:34.962 [2024-10-04 08:27:27.582248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:34.962 #13 NEW cov: 11721 ft: 12917 corp: 4/33b lim: 25 exec/s: 0 rss: 67Mb L: 18/18 MS: 1 InsertRepeatedBytes- 00:09:34.962 [2024-10-04 08:27:27.621985] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:34.962 [2024-10-04 08:27:27.622013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:35.220 #14 NEW cov: 11806 ft: 13291 corp: 5/42b lim: 25 exec/s: 0 rss: 67Mb L: 9/18 MS: 1 ChangeBit- 00:09:35.220 [2024-10-04 08:27:27.662056] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:35.220 [2024-10-04 08:27:27.662083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:35.220 #15 NEW cov: 11806 ft: 13371 corp: 6/47b lim: 25 exec/s: 0 rss: 67Mb L: 5/18 MS: 1 ChangeBit- 00:09:35.220 [2024-10-04 08:27:27.702190] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:35.220 [2024-10-04 08:27:27.702219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:35.220 #16 NEW cov: 11806 ft: 13485 corp: 7/52b lim: 25 exec/s: 0 rss: 68Mb L: 5/18 MS: 1 ChangeByte- 00:09:35.220 [2024-10-04 08:27:27.742540] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:35.220 [2024-10-04 08:27:27.742568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:35.220 [2024-10-04 08:27:27.742607] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:35.220 [2024-10-04 08:27:27.742624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:35.220 [2024-10-04 08:27:27.742676] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:35.220 [2024-10-04 08:27:27.742691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:35.220 #17 NEW cov: 11806 ft: 13597 corp: 8/70b lim: 25 exec/s: 0 rss: 68Mb L: 18/18 MS: 1 ShuffleBytes- 00:09:35.221 [2024-10-04 08:27:27.782641] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:35.221 [2024-10-04 08:27:27.782668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:35.221 [2024-10-04 08:27:27.782704] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:35.221 [2024-10-04 08:27:27.782719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:35.221 [2024-10-04 08:27:27.782772] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 
00:09:35.221 [2024-10-04 08:27:27.782787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:35.221 #18 NEW cov: 11806 ft: 13626 corp: 9/88b lim: 25 exec/s: 0 rss: 68Mb L: 18/18 MS: 1 ShuffleBytes- 00:09:35.221 [2024-10-04 08:27:27.822551] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:35.221 [2024-10-04 08:27:27.822578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:35.221 #19 NEW cov: 11806 ft: 13652 corp: 10/97b lim: 25 exec/s: 0 rss: 68Mb L: 9/18 MS: 1 ChangeByte- 00:09:35.221 [2024-10-04 08:27:27.862632] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:35.221 [2024-10-04 08:27:27.862658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:35.221 #20 NEW cov: 11806 ft: 13694 corp: 11/106b lim: 25 exec/s: 0 rss: 68Mb L: 9/18 MS: 1 ChangeByte- 00:09:35.479 [2024-10-04 08:27:27.903059] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:35.479 [2024-10-04 08:27:27.903087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:35.479 [2024-10-04 08:27:27.903129] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:35.479 [2024-10-04 08:27:27.903145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:35.479 [2024-10-04 08:27:27.903205] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:35.480 [2024-10-04 08:27:27.903221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:35.480 #21 NEW cov: 11806 ft: 13727 corp: 12/125b lim: 25 exec/s: 0 rss: 68Mb L: 19/19 MS: 1 CrossOver- 00:09:35.480 [2024-10-04 08:27:27.942881] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:35.480 [2024-10-04 08:27:27.942910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:35.480 #22 NEW cov: 11806 ft: 13767 corp: 13/134b lim: 25 exec/s: 0 rss: 68Mb L: 9/19 MS: 1 ChangeBinInt- 00:09:35.480 [2024-10-04 08:27:27.973060] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:35.480 [2024-10-04 08:27:27.973089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:35.480 #23 NEW cov: 11806 ft: 13824 corp: 14/140b lim: 25 exec/s: 0 rss: 68Mb L: 6/19 MS: 1 CrossOver- 00:09:35.480 [2024-10-04 08:27:28.013356] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:35.480 [2024-10-04 08:27:28.013385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:35.480 [2024-10-04 08:27:28.013424] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:35.480 
[2024-10-04 08:27:28.013439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:35.480 [2024-10-04 08:27:28.013495] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:35.480 [2024-10-04 08:27:28.013511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:35.480 #24 NEW cov: 11806 ft: 13869 corp: 15/159b lim: 25 exec/s: 0 rss: 68Mb L: 19/19 MS: 1 ChangeBit- 00:09:35.480 [2024-10-04 08:27:28.053216] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:35.480 [2024-10-04 08:27:28.053244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:35.480 #25 NEW cov: 11806 ft: 13894 corp: 16/164b lim: 25 exec/s: 0 rss: 68Mb L: 5/19 MS: 1 ChangeByte- 00:09:35.480 [2024-10-04 08:27:28.093360] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:35.480 [2024-10-04 08:27:28.093387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:35.480 #26 NEW cov: 11806 ft: 13967 corp: 17/173b lim: 25 exec/s: 0 rss: 68Mb L: 9/19 MS: 1 CopyPart- 00:09:35.480 [2024-10-04 08:27:28.133454] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:35.480 [2024-10-04 08:27:28.133481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:35.739 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:35.739 #27 NEW cov: 11829 ft: 14042 corp: 18/178b lim: 25 exec/s: 0 rss: 68Mb L: 5/19 MS: 1 CopyPart- 00:09:35.739 [2024-10-04 08:27:28.173592] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:35.739 [2024-10-04 08:27:28.173619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:35.739 #28 NEW cov: 11829 ft: 14057 corp: 19/187b lim: 25 exec/s: 0 rss: 68Mb L: 9/19 MS: 1 ChangeBinInt- 00:09:35.739 [2024-10-04 08:27:28.203813] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:35.739 [2024-10-04 08:27:28.203840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:35.739 [2024-10-04 08:27:28.203883] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:35.739 [2024-10-04 08:27:28.203899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:35.739 #29 NEW cov: 11829 ft: 14283 corp: 20/199b lim: 25 exec/s: 29 rss: 68Mb L: 12/19 MS: 1 InsertRepeatedBytes- 00:09:35.739 [2024-10-04 08:27:28.243786] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:35.739 [2024-10-04 08:27:28.243814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:35.739 #30 NEW cov: 11829 
ft: 14290 corp: 21/204b lim: 25 exec/s: 30 rss: 68Mb L: 5/19 MS: 1 EraseBytes- 00:09:35.739 [2024-10-04 08:27:28.283898] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:35.739 [2024-10-04 08:27:28.283925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:35.739 #36 NEW cov: 11829 ft: 14296 corp: 22/213b lim: 25 exec/s: 36 rss: 68Mb L: 9/19 MS: 1 ChangeByte- 00:09:35.739 [2024-10-04 08:27:28.324012] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:35.739 [2024-10-04 08:27:28.324039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:35.739 #37 NEW cov: 11829 ft: 14304 corp: 23/222b lim: 25 exec/s: 37 rss: 68Mb L: 9/19 MS: 1 ShuffleBytes- 00:09:35.739 [2024-10-04 08:27:28.364112] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:35.739 [2024-10-04 08:27:28.364140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:35.739 #38 NEW cov: 11829 ft: 14336 corp: 24/227b lim: 25 exec/s: 38 rss: 69Mb L: 5/19 MS: 1 ShuffleBytes- 00:09:35.739 [2024-10-04 08:27:28.404655] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:35.739 [2024-10-04 08:27:28.404684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:35.739 [2024-10-04 08:27:28.404730] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:35.739 [2024-10-04 08:27:28.404746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:35.739 [2024-10-04 08:27:28.404798] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:35.739 [2024-10-04 08:27:28.404812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:35.739 [2024-10-04 08:27:28.404865] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:09:35.739 [2024-10-04 08:27:28.404882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:35.998 #39 NEW cov: 11829 ft: 14770 corp: 25/249b lim: 25 exec/s: 39 rss: 69Mb L: 22/22 MS: 1 InsertRepeatedBytes- 00:09:35.999 [2024-10-04 08:27:28.444390] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:35.999 [2024-10-04 08:27:28.444417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:35.999 #40 NEW cov: 11829 ft: 14773 corp: 26/254b lim: 25 exec/s: 40 rss: 69Mb L: 5/22 MS: 1 EraseBytes- 00:09:35.999 [2024-10-04 08:27:28.484501] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:35.999 [2024-10-04 08:27:28.484528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:35.999 #41 NEW cov: 
11829 ft: 14812 corp: 27/263b lim: 25 exec/s: 41 rss: 69Mb L: 9/22 MS: 1 ChangeBinInt- 00:09:35.999 [2024-10-04 08:27:28.514581] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:35.999 [2024-10-04 08:27:28.514608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:35.999 #42 NEW cov: 11829 ft: 14820 corp: 28/268b lim: 25 exec/s: 42 rss: 69Mb L: 5/22 MS: 1 ChangeByte- 00:09:35.999 [2024-10-04 08:27:28.554974] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:35.999 [2024-10-04 08:27:28.555001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:35.999 [2024-10-04 08:27:28.555037] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:35.999 [2024-10-04 08:27:28.555052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:35.999 [2024-10-04 08:27:28.555106] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:35.999 [2024-10-04 08:27:28.555122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:35.999 #43 NEW cov: 11829 ft: 14825 corp: 29/286b lim: 25 exec/s: 43 rss: 69Mb L: 18/22 MS: 1 CrossOver- 00:09:35.999 [2024-10-04 08:27:28.594864] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:35.999 [2024-10-04 08:27:28.594891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:35.999 #44 NEW cov: 11829 ft: 14834 corp: 30/295b lim: 25 exec/s: 44 rss: 69Mb L: 9/22 MS: 1 ChangeBit- 00:09:35.999 [2024-10-04 08:27:28.635202] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:35.999 [2024-10-04 08:27:28.635230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:35.999 [2024-10-04 08:27:28.635270] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:35.999 [2024-10-04 08:27:28.635285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:35.999 [2024-10-04 08:27:28.635339] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:35.999 [2024-10-04 08:27:28.635354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:35.999 #46 NEW cov: 11829 ft: 14838 corp: 31/313b lim: 25 exec/s: 46 rss: 69Mb L: 18/22 MS: 2 EraseBytes-InsertRepeatedBytes- 00:09:35.999 [2024-10-04 08:27:28.675132] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:35.999 [2024-10-04 08:27:28.675160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:36.258 #47 NEW cov: 11829 ft: 14839 corp: 32/319b lim: 25 exec/s: 47 rss: 69Mb L: 6/22 MS: 1 CopyPart- 00:09:36.258 
[2024-10-04 08:27:28.705204] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:36.258 [2024-10-04 08:27:28.705230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:36.258 #48 NEW cov: 11829 ft: 14958 corp: 33/328b lim: 25 exec/s: 48 rss: 69Mb L: 9/22 MS: 1 ShuffleBytes- 00:09:36.258 [2024-10-04 08:27:28.745642] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:36.258 [2024-10-04 08:27:28.745670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:36.258 [2024-10-04 08:27:28.745719] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:36.258 [2024-10-04 08:27:28.745735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:36.258 [2024-10-04 08:27:28.745790] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:36.258 [2024-10-04 08:27:28.745805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:36.258 [2024-10-04 08:27:28.745862] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:09:36.258 [2024-10-04 08:27:28.745877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:36.258 #49 NEW cov: 11829 ft: 14969 corp: 34/349b lim: 25 exec/s: 49 rss: 69Mb L: 21/22 MS: 1 InsertRepeatedBytes- 00:09:36.258 [2024-10-04 08:27:28.785396] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:36.258 [2024-10-04 08:27:28.785425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:36.258 #50 NEW cov: 11829 ft: 14990 corp: 35/355b lim: 25 exec/s: 50 rss: 69Mb L: 6/22 MS: 1 ShuffleBytes- 00:09:36.258 [2024-10-04 08:27:28.815509] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:36.258 [2024-10-04 08:27:28.815537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:36.258 #51 NEW cov: 11829 ft: 15048 corp: 36/364b lim: 25 exec/s: 51 rss: 69Mb L: 9/22 MS: 1 ShuffleBytes- 00:09:36.258 [2024-10-04 08:27:28.855618] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:36.258 [2024-10-04 08:27:28.855646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:36.258 #52 NEW cov: 11829 ft: 15063 corp: 37/369b lim: 25 exec/s: 52 rss: 69Mb L: 5/22 MS: 1 ShuffleBytes- 00:09:36.258 [2024-10-04 08:27:28.895957] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:36.258 [2024-10-04 08:27:28.895986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:36.258 [2024-10-04 08:27:28.896025] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION 
REPORT (0e) sqid:1 cid:1 nsid:0 00:09:36.258 [2024-10-04 08:27:28.896040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:36.258 [2024-10-04 08:27:28.896090] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:36.258 [2024-10-04 08:27:28.896105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:36.258 #53 NEW cov: 11829 ft: 15079 corp: 38/387b lim: 25 exec/s: 53 rss: 69Mb L: 18/22 MS: 1 CrossOver- 00:09:36.258 [2024-10-04 08:27:28.936095] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:36.258 [2024-10-04 08:27:28.936122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:36.258 [2024-10-04 08:27:28.936160] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:36.258 [2024-10-04 08:27:28.936175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:36.258 [2024-10-04 08:27:28.936237] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:36.258 [2024-10-04 08:27:28.936254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:36.518 #54 NEW cov: 11829 ft: 15085 corp: 39/406b lim: 25 exec/s: 54 rss: 69Mb L: 19/22 MS: 1 InsertRepeatedBytes- 00:09:36.518 [2024-10-04 08:27:28.976081] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:36.518 [2024-10-04 08:27:28.976108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:36.518 [2024-10-04 08:27:28.976147] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:36.518 [2024-10-04 08:27:28.976163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:36.518 #55 NEW cov: 11829 ft: 15090 corp: 40/418b lim: 25 exec/s: 55 rss: 69Mb L: 12/22 MS: 1 CrossOver- 00:09:36.518 [2024-10-04 08:27:29.016095] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:36.518 [2024-10-04 08:27:29.016122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:36.518 #56 NEW cov: 11829 ft: 15098 corp: 41/426b lim: 25 exec/s: 56 rss: 69Mb L: 8/22 MS: 1 CopyPart- 00:09:36.518 [2024-10-04 08:27:29.046556] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:36.518 [2024-10-04 08:27:29.046584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:36.518 [2024-10-04 08:27:29.046626] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:36.518 [2024-10-04 08:27:29.046641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:36.518 
[2024-10-04 08:27:29.046696] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:36.518 [2024-10-04 08:27:29.046711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:36.518 [2024-10-04 08:27:29.046766] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:09:36.518 [2024-10-04 08:27:29.046782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:36.518 #57 NEW cov: 11829 ft: 15118 corp: 42/449b lim: 25 exec/s: 57 rss: 69Mb L: 23/23 MS: 1 InsertRepeatedBytes- 00:09:36.518 [2024-10-04 08:27:29.086292] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:36.518 [2024-10-04 08:27:29.086319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:36.518 #60 NEW cov: 11829 ft: 15127 corp: 43/457b lim: 25 exec/s: 60 rss: 69Mb L: 8/23 MS: 3 ChangeBit-ShuffleBytes-CrossOver- 00:09:36.518 [2024-10-04 08:27:29.116360] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:36.518 [2024-10-04 08:27:29.116387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:36.518 #61 NEW cov: 11829 ft: 15143 corp: 44/463b lim: 25 exec/s: 61 rss: 69Mb L: 6/23 MS: 1 ChangeBinInt- 00:09:36.518 [2024-10-04 08:27:29.156489] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:36.518 [2024-10-04 08:27:29.156515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:36.518 #62 NEW cov: 11829 ft: 15156 corp: 45/471b lim: 25 exec/s: 62 rss: 70Mb L: 8/23 MS: 1 ChangeByte- 00:09:36.518 [2024-10-04 08:27:29.196639] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:36.518 [2024-10-04 08:27:29.196666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:36.778 [2024-10-04 08:27:29.236754] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:36.778 [2024-10-04 08:27:29.236781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:36.778 #64 pulse cov: 11829 ft: 15168 corp: 45/471b lim: 25 exec/s: 32 rss: 70Mb 00:09:36.778 #64 NEW cov: 11829 ft: 15168 corp: 46/480b lim: 25 exec/s: 32 rss: 70Mb L: 9/23 MS: 2 ChangeByte-ChangeBinInt- 00:09:36.778 #64 DONE cov: 11829 ft: 15168 corp: 46/480b lim: 25 exec/s: 32 rss: 70Mb 00:09:36.778 Done 64 runs in 2 second(s) 00:09:36.778 08:27:29 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_23.conf 00:09:36.778 08:27:29 -- ../common.sh@72 -- # (( i++ )) 00:09:36.778 08:27:29 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:36.778 08:27:29 -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1 00:09:36.778 08:27:29 -- nvmf/run.sh@23 -- # local fuzzer_type=24 00:09:36.778 08:27:29 -- nvmf/run.sh@24 -- # local timen=1 00:09:36.778 08:27:29 -- nvmf/run.sh@25 -- # local core=0x1 00:09:36.778 08:27:29 -- nvmf/run.sh@26 -- # local 
corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24
00:09:36.779 08:27:29 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf
00:09:36.779 08:27:29 -- nvmf/run.sh@29 -- # printf %02d 24
00:09:36.779 08:27:29 -- nvmf/run.sh@29 -- # port=4424
00:09:36.779 08:27:29 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24
00:09:36.779 08:27:29 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424'
00:09:36.779 08:27:29 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:09:36.779 08:27:29 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 -r /var/tmp/spdk24.sock
00:09:36.779 [2024-10-04 08:27:29.411926] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization...
00:09:36.779 [2024-10-04 08:27:29.412021] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1023220 ]
00:09:37.039 EAL: No free 2048 kB hugepages reported on node 1
00:09:37.039 [2024-10-04 08:27:29.590787] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:09:37.039 [2024-10-04 08:27:29.610278] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:09:37.039 [2024-10-04 08:27:29.610400] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:09:37.039 [2024-10-04 08:27:29.661652] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:09:37.039 [2024-10-04 08:27:29.678055] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 ***
00:09:37.039 INFO: Running with entropic power schedule (0xFF, 100).
00:09:37.039 INFO: Seed: 2040010180
00:09:37.039 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63),
00:09:37.039 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8),
00:09:37.039 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24
00:09:37.039 INFO: A corpus is not provided, starting from an empty corpus
00:09:37.039 #2 INITED exec/s: 0 rss: 59Mb
00:09:37.039 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:09:37.039 This may also happen if the target rejected all inputs we tried so far 00:09:37.299 [2024-10-04 08:27:29.723075] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.299 [2024-10-04 08:27:29.723107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.558 NEW_FUNC[1/672]: 0x47cdc8 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:09:37.558 NEW_FUNC[2/672]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:37.558 #15 NEW cov: 11674 ft: 11673 corp: 2/39b lim: 100 exec/s: 0 rss: 67Mb L: 38/38 MS: 3 CopyPart-ChangeBinInt-InsertRepeatedBytes- 00:09:37.558 [2024-10-04 08:27:30.013806] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.558 [2024-10-04 08:27:30.013840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.558 #21 NEW cov: 11787 ft: 12049 corp: 3/78b lim: 100 exec/s: 0 rss: 67Mb L: 39/39 MS: 1 InsertByte- 00:09:37.558 [2024-10-04 08:27:30.063889] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.558 [2024-10-04 08:27:30.063920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.558 #22 NEW cov: 11793 ft: 12365 corp: 4/117b lim: 100 exec/s: 0 rss: 67Mb L: 39/39 MS: 1 CopyPart- 00:09:37.558 [2024-10-04 08:27:30.103976] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65280 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.558 [2024-10-04 08:27:30.104008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.558 #23 NEW cov: 11878 ft: 12629 corp: 5/156b lim: 100 exec/s: 0 rss: 67Mb L: 39/39 MS: 1 ChangeBit- 00:09:37.558 [2024-10-04 08:27:30.144418] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4026531825 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.558 [2024-10-04 08:27:30.144447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.558 [2024-10-04 08:27:30.144485] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.558 [2024-10-04 08:27:30.144500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:37.558 [2024-10-04 08:27:30.144556] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.558 [2024-10-04 08:27:30.144576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:37.558 #28 NEW cov: 11878 ft: 13680 corp: 6/232b lim: 100 exec/s: 0 rss: 67Mb L: 76/76 MS: 5 
ChangeBinInt-CrossOver-InsertByte-ChangeBit-InsertRepeatedBytes- 00:09:37.558 [2024-10-04 08:27:30.184192] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.558 [2024-10-04 08:27:30.184221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.558 #29 NEW cov: 11878 ft: 13820 corp: 7/271b lim: 100 exec/s: 0 rss: 67Mb L: 39/76 MS: 1 ChangeByte- 00:09:37.558 [2024-10-04 08:27:30.224771] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:12080808861324781479 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.558 [2024-10-04 08:27:30.224801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.558 [2024-10-04 08:27:30.224844] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:12080808863958804391 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.558 [2024-10-04 08:27:30.224859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:37.558 [2024-10-04 08:27:30.224914] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:12080808863958804391 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.558 [2024-10-04 08:27:30.224930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:37.558 [2024-10-04 08:27:30.224985] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:12080808863958804391 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.558 [2024-10-04 08:27:30.225000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:37.818 #34 NEW cov: 11878 ft: 14280 corp: 8/353b lim: 100 exec/s: 0 rss: 67Mb L: 82/82 MS: 5 ShuffleBytes-InsertByte-EraseBytes-CrossOver-InsertRepeatedBytes- 00:09:37.818 [2024-10-04 08:27:30.264727] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4026531825 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.818 [2024-10-04 08:27:30.264755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.818 [2024-10-04 08:27:30.264795] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:70650219154374656 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.818 [2024-10-04 08:27:30.264811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:37.818 [2024-10-04 08:27:30.264864] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.818 [2024-10-04 08:27:30.264880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:37.818 #35 NEW cov: 11878 ft: 14326 corp: 9/429b lim: 100 exec/s: 0 rss: 67Mb L: 76/82 MS: 1 ChangeBinInt- 00:09:37.818 [2024-10-04 08:27:30.315062] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 
lba:4026531825 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.818 [2024-10-04 08:27:30.315091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.818 [2024-10-04 08:27:30.315142] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.818 [2024-10-04 08:27:30.315158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:37.818 [2024-10-04 08:27:30.315221] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.818 [2024-10-04 08:27:30.315238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:37.818 [2024-10-04 08:27:30.315297] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.818 [2024-10-04 08:27:30.315314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:37.818 #36 NEW cov: 11878 ft: 14377 corp: 10/525b lim: 100 exec/s: 0 rss: 67Mb L: 96/96 MS: 1 CopyPart- 00:09:37.818 [2024-10-04 08:27:30.355122] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:12080808861324781479 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.818 [2024-10-04 08:27:30.355151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.818 [2024-10-04 08:27:30.355192] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:12080808863958804391 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.818 [2024-10-04 08:27:30.355205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:37.818 [2024-10-04 08:27:30.355262] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:12080808863958804391 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.818 [2024-10-04 08:27:30.355279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:37.818 [2024-10-04 08:27:30.355331] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:12080771334534571943 len:34182 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.818 [2024-10-04 08:27:30.355346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:37.818 #37 NEW cov: 11878 ft: 14405 corp: 11/622b lim: 100 exec/s: 0 rss: 67Mb L: 97/97 MS: 1 InsertRepeatedBytes- 00:09:37.818 [2024-10-04 08:27:30.405029] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.818 [2024-10-04 08:27:30.405058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.818 [2024-10-04 08:27:30.405121] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 
lba:18446744073709505791 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.818 [2024-10-04 08:27:30.405137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:37.818 #38 NEW cov: 11878 ft: 14803 corp: 12/662b lim: 100 exec/s: 0 rss: 67Mb L: 40/97 MS: 1 InsertByte- 00:09:37.818 [2024-10-04 08:27:30.455512] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4026531825 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.818 [2024-10-04 08:27:30.455541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.818 [2024-10-04 08:27:30.455582] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.818 [2024-10-04 08:27:30.455596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:37.818 [2024-10-04 08:27:30.455650] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.818 [2024-10-04 08:27:30.455667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:37.818 [2024-10-04 08:27:30.455726] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:37.818 [2024-10-04 08:27:30.455740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:37.818 #39 NEW cov: 11878 ft: 14869 corp: 13/758b lim: 100 exec/s: 0 rss: 68Mb L: 96/97 MS: 1 CopyPart- 00:09:38.078 [2024-10-04 08:27:30.505136] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.078 [2024-10-04 08:27:30.505165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:38.078 #40 NEW cov: 11878 ft: 14971 corp: 14/797b lim: 100 exec/s: 0 rss: 68Mb L: 39/97 MS: 1 ShuffleBytes- 00:09:38.078 [2024-10-04 08:27:30.545697] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:12080808861324781479 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.078 [2024-10-04 08:27:30.545726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:38.078 [2024-10-04 08:27:30.545774] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:12080808863958804391 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.078 [2024-10-04 08:27:30.545790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:38.078 [2024-10-04 08:27:30.545845] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:12080808863958804391 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.078 [2024-10-04 08:27:30.545861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:38.078 [2024-10-04 08:27:30.545919] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:12080808863958804391 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.078 [2024-10-04 08:27:30.545935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:38.078 #41 NEW cov: 11878 ft: 14980 corp: 15/883b lim: 100 exec/s: 0 rss: 68Mb L: 86/97 MS: 1 CMP- DE: "H\000\000\000"- 00:09:38.078 [2024-10-04 08:27:30.585523] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.078 [2024-10-04 08:27:30.585551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:38.078 [2024-10-04 08:27:30.585603] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709505791 len:65329 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.078 [2024-10-04 08:27:30.585619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:38.078 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:38.078 #42 NEW cov: 11901 ft: 15006 corp: 16/923b lim: 100 exec/s: 0 rss: 68Mb L: 40/97 MS: 1 CopyPart- 00:09:38.078 [2024-10-04 08:27:30.635673] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.078 [2024-10-04 08:27:30.635702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:38.078 [2024-10-04 08:27:30.635758] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:5188146775025732863 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.078 [2024-10-04 08:27:30.635774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:38.078 #43 NEW cov: 11901 ft: 15020 corp: 17/966b lim: 100 exec/s: 0 rss: 68Mb L: 43/97 MS: 1 PersAutoDict- DE: "H\000\000\000"- 00:09:38.078 [2024-10-04 08:27:30.685695] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.078 [2024-10-04 08:27:30.685723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:38.078 #44 NEW cov: 11901 ft: 15091 corp: 18/1005b lim: 100 exec/s: 0 rss: 68Mb L: 39/97 MS: 1 ChangeBinInt- 00:09:38.078 [2024-10-04 08:27:30.726250] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4026531825 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.078 [2024-10-04 08:27:30.726279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:38.078 [2024-10-04 08:27:30.726328] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.078 [2024-10-04 08:27:30.726344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:38.078 
[2024-10-04 08:27:30.726396] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.078 [2024-10-04 08:27:30.726413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:38.078 [2024-10-04 08:27:30.726466] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.079 [2024-10-04 08:27:30.726482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:38.079 #45 NEW cov: 11901 ft: 15131 corp: 19/1101b lim: 100 exec/s: 45 rss: 68Mb L: 96/97 MS: 1 CopyPart- 00:09:38.338 [2024-10-04 08:27:30.776427] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4026531825 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.338 [2024-10-04 08:27:30.776456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:38.338 [2024-10-04 08:27:30.776496] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.338 [2024-10-04 08:27:30.776511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:38.338 [2024-10-04 08:27:30.776564] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.338 [2024-10-04 08:27:30.776581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:38.338 [2024-10-04 08:27:30.776636] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.338 [2024-10-04 08:27:30.776651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:38.338 #46 NEW cov: 11901 ft: 15140 corp: 20/1197b lim: 100 exec/s: 46 rss: 68Mb L: 96/97 MS: 1 ChangeByte- 00:09:38.338 [2024-10-04 08:27:30.816531] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4026531825 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.339 [2024-10-04 08:27:30.816559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:38.339 [2024-10-04 08:27:30.816603] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.339 [2024-10-04 08:27:30.816619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:38.339 [2024-10-04 08:27:30.816676] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.339 [2024-10-04 08:27:30.816694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:38.339 [2024-10-04 08:27:30.816749] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.339 [2024-10-04 08:27:30.816765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:38.339 #47 NEW cov: 11901 ft: 15200 corp: 21/1294b lim: 100 exec/s: 47 rss: 68Mb L: 97/97 MS: 1 CopyPart- 00:09:38.339 [2024-10-04 08:27:30.856308] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.339 [2024-10-04 08:27:30.856336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:38.339 [2024-10-04 08:27:30.856388] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:5188146775025732863 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.339 [2024-10-04 08:27:30.856405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:38.339 #48 NEW cov: 11901 ft: 15222 corp: 22/1337b lim: 100 exec/s: 48 rss: 68Mb L: 43/97 MS: 1 ShuffleBytes- 00:09:38.339 [2024-10-04 08:27:30.906791] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:12080808861324781479 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.339 [2024-10-04 08:27:30.906820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:38.339 [2024-10-04 08:27:30.906864] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:12080808863958804391 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.339 [2024-10-04 08:27:30.906880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:38.339 [2024-10-04 08:27:30.906933] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:12080808863958804391 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.339 [2024-10-04 08:27:30.906950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:38.339 [2024-10-04 08:27:30.907006] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:12080808863958804391 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.339 [2024-10-04 08:27:30.907022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:38.339 #49 NEW cov: 11901 ft: 15242 corp: 23/1419b lim: 100 exec/s: 49 rss: 68Mb L: 82/97 MS: 1 ShuffleBytes- 00:09:38.339 [2024-10-04 08:27:30.946562] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.339 [2024-10-04 08:27:30.946590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:38.339 [2024-10-04 08:27:30.946626] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:5188146775025732863 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.339 [2024-10-04 08:27:30.946642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 
dnr:1 00:09:38.339 #50 NEW cov: 11901 ft: 15263 corp: 24/1462b lim: 100 exec/s: 50 rss: 68Mb L: 43/97 MS: 1 ChangeBit- 00:09:38.339 [2024-10-04 08:27:30.986508] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073692839935 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.339 [2024-10-04 08:27:30.986537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:38.339 #51 NEW cov: 11901 ft: 15309 corp: 25/1500b lim: 100 exec/s: 51 rss: 68Mb L: 38/97 MS: 1 ChangeBinInt- 00:09:38.599 [2024-10-04 08:27:31.027109] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:12080808861324781479 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.599 [2024-10-04 08:27:31.027138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:38.599 [2024-10-04 08:27:31.027192] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:12080808863958804391 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.599 [2024-10-04 08:27:31.027208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:38.599 [2024-10-04 08:27:31.027265] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:12080808863958804391 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.599 [2024-10-04 08:27:31.027282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:38.599 [2024-10-04 08:27:31.027338] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:12080808863958804391 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.599 [2024-10-04 08:27:31.027354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:38.599 #52 NEW cov: 11901 ft: 15326 corp: 26/1586b lim: 100 exec/s: 52 rss: 68Mb L: 86/97 MS: 1 ShuffleBytes- 00:09:38.599 [2024-10-04 08:27:31.077315] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4026531825 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.599 [2024-10-04 08:27:31.077344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:38.599 [2024-10-04 08:27:31.077383] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.599 [2024-10-04 08:27:31.077400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:38.599 [2024-10-04 08:27:31.077454] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.599 [2024-10-04 08:27:31.077469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:38.599 [2024-10-04 08:27:31.077526] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.599 [2024-10-04 08:27:31.077542] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:38.599 #53 NEW cov: 11901 ft: 15336 corp: 27/1682b lim: 100 exec/s: 53 rss: 68Mb L: 96/97 MS: 1 ShuffleBytes- 00:09:38.599 [2024-10-04 08:27:31.117398] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:12080808861324781479 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.599 [2024-10-04 08:27:31.117427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:38.599 [2024-10-04 08:27:31.117470] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:12080808863958804391 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.599 [2024-10-04 08:27:31.117486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:38.599 [2024-10-04 08:27:31.117543] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:12080808863958804391 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.599 [2024-10-04 08:27:31.117559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:38.599 [2024-10-04 08:27:31.117617] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:12080808863958804391 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.599 [2024-10-04 08:27:31.117632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:38.599 #54 NEW cov: 11901 ft: 15348 corp: 28/1765b lim: 100 exec/s: 54 rss: 68Mb L: 83/97 MS: 1 CrossOver- 00:09:38.599 [2024-10-04 08:27:31.167222] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073692839935 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.599 [2024-10-04 08:27:31.167250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:38.599 [2024-10-04 08:27:31.167288] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.599 [2024-10-04 08:27:31.167305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:38.599 #55 NEW cov: 11901 ft: 15377 corp: 29/1807b lim: 100 exec/s: 55 rss: 68Mb L: 42/97 MS: 1 PersAutoDict- DE: "H\000\000\000"- 00:09:38.599 [2024-10-04 08:27:31.217221] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.599 [2024-10-04 08:27:31.217250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:38.599 #56 NEW cov: 11901 ft: 15423 corp: 30/1846b lim: 100 exec/s: 56 rss: 68Mb L: 39/97 MS: 1 CopyPart- 00:09:38.599 [2024-10-04 08:27:31.257800] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:12080808861324781479 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.599 [2024-10-04 08:27:31.257828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:38.599 [2024-10-04 08:27:31.257869] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:12080808863958804391 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.599 [2024-10-04 08:27:31.257884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:38.599 [2024-10-04 08:27:31.257938] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:12080808863958804391 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.599 [2024-10-04 08:27:31.257954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:38.599 [2024-10-04 08:27:31.258007] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:12080808863958804391 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.599 [2024-10-04 08:27:31.258024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:38.599 #57 NEW cov: 11901 ft: 15440 corp: 31/1932b lim: 100 exec/s: 57 rss: 68Mb L: 86/97 MS: 1 ChangeBit- 00:09:38.859 [2024-10-04 08:27:31.297627] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.859 [2024-10-04 08:27:31.297655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:38.859 [2024-10-04 08:27:31.297701] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18394952682289746508 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.859 [2024-10-04 08:27:31.297717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:38.859 #58 NEW cov: 11901 ft: 15448 corp: 32/1976b lim: 100 exec/s: 58 rss: 69Mb L: 44/97 MS: 1 InsertByte- 00:09:38.859 [2024-10-04 08:27:31.338033] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4026531825 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.859 [2024-10-04 08:27:31.338062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:38.859 [2024-10-04 08:27:31.338109] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.859 [2024-10-04 08:27:31.338125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:38.859 [2024-10-04 08:27:31.338180] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.859 [2024-10-04 08:27:31.338200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:38.859 [2024-10-04 08:27:31.338254] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.859 [2024-10-04 08:27:31.338271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 
cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:38.859 #59 NEW cov: 11901 ft: 15455 corp: 33/2072b lim: 100 exec/s: 59 rss: 69Mb L: 96/97 MS: 1 ChangeBit- 00:09:38.859 [2024-10-04 08:27:31.387714] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:12080808865440988938 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.859 [2024-10-04 08:27:31.387743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:38.859 #60 NEW cov: 11901 ft: 15460 corp: 34/2111b lim: 100 exec/s: 60 rss: 69Mb L: 39/97 MS: 1 CrossOver- 00:09:38.859 [2024-10-04 08:27:31.418123] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4026531825 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.859 [2024-10-04 08:27:31.418151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:38.859 [2024-10-04 08:27:31.418205] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.859 [2024-10-04 08:27:31.418222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:38.859 [2024-10-04 08:27:31.418276] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.859 [2024-10-04 08:27:31.418291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:38.859 #61 NEW cov: 11901 ft: 15470 corp: 35/2190b lim: 100 exec/s: 61 rss: 69Mb L: 79/97 MS: 1 EraseBytes- 00:09:38.859 [2024-10-04 08:27:31.458394] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4026531825 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.859 [2024-10-04 08:27:31.458423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:38.859 [2024-10-04 08:27:31.458472] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.859 [2024-10-04 08:27:31.458488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:38.859 [2024-10-04 08:27:31.458541] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.859 [2024-10-04 08:27:31.458558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:38.859 [2024-10-04 08:27:31.458612] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.859 [2024-10-04 08:27:31.458633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:38.859 #62 NEW cov: 11901 ft: 15494 corp: 36/2288b lim: 100 exec/s: 62 rss: 69Mb L: 98/98 MS: 1 CopyPart- 00:09:38.859 [2024-10-04 08:27:31.498164] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:09:38.859 [2024-10-04 08:27:31.498198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:38.859 [2024-10-04 08:27:31.498241] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:5188146775025732863 len:18433 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.859 [2024-10-04 08:27:31.498257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:38.859 #63 NEW cov: 11901 ft: 15499 corp: 37/2331b lim: 100 exec/s: 63 rss: 69Mb L: 43/98 MS: 1 PersAutoDict- DE: "H\000\000\000"- 00:09:38.859 [2024-10-04 08:27:31.538131] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073692839935 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:38.859 [2024-10-04 08:27:31.538160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:39.119 #64 NEW cov: 11901 ft: 15501 corp: 38/2369b lim: 100 exec/s: 64 rss: 69Mb L: 38/98 MS: 1 ChangeBit- 00:09:39.119 [2024-10-04 08:27:31.578695] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:39.119 [2024-10-04 08:27:31.578724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:39.119 [2024-10-04 08:27:31.578762] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18377685569435028735 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:39.119 [2024-10-04 08:27:31.578777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:39.119 [2024-10-04 08:27:31.578833] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:12080808863958804391 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:39.119 [2024-10-04 08:27:31.578850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:39.119 [2024-10-04 08:27:31.578904] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446743694270244863 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:39.119 [2024-10-04 08:27:31.578920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:39.119 #70 NEW cov: 11901 ft: 15505 corp: 39/2457b lim: 100 exec/s: 70 rss: 69Mb L: 88/98 MS: 1 CrossOver- 00:09:39.119 [2024-10-04 08:27:31.618358] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1099494915912 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:39.119 [2024-10-04 08:27:31.618386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:39.119 #71 NEW cov: 11901 ft: 15573 corp: 40/2495b lim: 100 exec/s: 71 rss: 69Mb L: 38/98 MS: 1 PersAutoDict- DE: "H\000\000\000"- 00:09:39.119 [2024-10-04 08:27:31.658815] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:12080808861324781479 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:39.119 [2024-10-04 08:27:31.658843] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:39.119 [2024-10-04 08:27:31.658884] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:12080808863958804391 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:39.119 [2024-10-04 08:27:31.658903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:39.119 [2024-10-04 08:27:31.658958] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:12080808863958804391 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:39.120 [2024-10-04 08:27:31.658974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:39.120 #72 NEW cov: 11901 ft: 15615 corp: 41/2556b lim: 100 exec/s: 72 rss: 69Mb L: 61/98 MS: 1 EraseBytes- 00:09:39.120 [2024-10-04 08:27:31.698721] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:39.120 [2024-10-04 08:27:31.698749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:39.120 [2024-10-04 08:27:31.698786] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:5188146770730811435 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:39.120 [2024-10-04 08:27:31.698801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:39.120 #73 NEW cov: 11901 ft: 15630 corp: 42/2599b lim: 100 exec/s: 36 rss: 69Mb L: 43/98 MS: 1 ChangeBinInt- 00:09:39.120 #73 DONE cov: 11901 ft: 15630 corp: 42/2599b lim: 100 exec/s: 36 rss: 69Mb 00:09:39.120 ###### Recommended dictionary. ###### 00:09:39.120 "H\000\000\000" # Uses: 4 00:09:39.120 ###### End of recommended dictionary. 
######
00:09:39.120 Done 73 runs in 2 second(s)
00:09:39.380 08:27:31 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_24.conf
00:09:39.380 08:27:31 -- ../common.sh@72 -- # (( i++ ))
00:09:39.380 08:27:31 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:09:39.380 08:27:31 -- nvmf/run.sh@71 -- # trap - SIGINT SIGTERM EXIT
00:09:39.380
00:09:39.380 real 1m2.375s
00:09:39.380 user 1m39.203s
00:09:39.380 sys 0m7.022s
00:09:39.380 08:27:31 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:09:39.380 08:27:31 -- common/autotest_common.sh@10 -- # set +x
00:09:39.380 ************************************
00:09:39.380 END TEST nvmf_fuzz
00:09:39.380 ************************************
00:09:39.380 08:27:31 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}"
00:09:39.380 08:27:31 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in
00:09:39.380 08:27:31 -- fuzz/llvm.sh@63 -- # run_test vfio_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh
00:09:39.380 08:27:31 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:09:39.380 08:27:31 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:09:39.380 08:27:31 -- common/autotest_common.sh@10 -- # set +x
00:09:39.380 ************************************
00:09:39.380 START TEST vfio_fuzz
00:09:39.380 ************************************
00:09:39.380 08:27:31 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh
00:09:39.380 * Looking for test storage...
00:09:39.380 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio
00:09:39.380 08:27:31 -- vfio/run.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh
00:09:39.380 08:27:31 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh
00:09:39.380 08:27:31 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd
00:09:39.380 08:27:31 -- common/autotest_common.sh@34 -- # set -e
00:09:39.380 08:27:31 -- common/autotest_common.sh@35 -- # shopt -s nullglob
00:09:39.380 08:27:31 -- common/autotest_common.sh@36 -- # shopt -s extglob
00:09:39.380 08:27:31 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]]
00:09:39.380 08:27:31 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh
00:09:39.380 08:27:31 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR=
00:09:39.380 08:27:31 -- common/build_config.sh@2 -- # CONFIG_ASAN=n
00:09:39.380 08:27:31 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n
00:09:39.380 08:27:31 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y
00:09:39.380 08:27:31 -- common/build_config.sh@5 -- # CONFIG_USDT=n
00:09:39.380 08:27:31 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n
00:09:39.380 08:27:31 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local
00:09:39.380 08:27:31 -- common/build_config.sh@8 -- # CONFIG_RBD=n
00:09:39.380 08:27:31 -- common/build_config.sh@9 -- # CONFIG_LIBDIR=
00:09:39.380 08:27:31 -- common/build_config.sh@10 -- # CONFIG_IDXD=y
00:09:39.380 08:27:31 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y
00:09:39.380 08:27:31 -- common/build_config.sh@12 -- # CONFIG_SMA=n
00:09:39.380 08:27:31 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n
00:09:39.380 08:27:31 -- common/build_config.sh@14 -- # CONFIG_TSAN=n
00:09:39.380 08:27:31 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y
00:09:39.380 08:27:31 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR=
00:09:39.380 08:27:31 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n
00:09:39.380 08:27:31 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y
00:09:39.380 08:27:31 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:09:39.380 08:27:31 -- common/build_config.sh@20 -- # CONFIG_LTO=n
00:09:39.380 08:27:31 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y
00:09:39.380 08:27:31 -- common/build_config.sh@22 -- # CONFIG_CET=n
00:09:39.380 08:27:31 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n
00:09:39.380 08:27:31 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH=
00:09:39.380 08:27:31 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y
00:09:39.380 08:27:31 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y
00:09:39.380 08:27:31 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n
00:09:39.380 08:27:31 -- common/build_config.sh@28 -- # CONFIG_UBLK=y
00:09:39.380 08:27:31 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y
00:09:39.380 08:27:31 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH=
00:09:39.380 08:27:31 -- common/build_config.sh@31 -- # CONFIG_OCF=n
00:09:39.380 08:27:31 -- common/build_config.sh@32 -- # CONFIG_FUSE=n
00:09:39.380 08:27:31 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR=
00:09:39.380 08:27:31 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a
00:09:39.381 08:27:31 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y
00:09:39.381 08:27:31 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:09:39.381 08:27:31 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n
00:09:39.381 08:27:31 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n
00:09:39.381 08:27:31 -- common/build_config.sh@39 -- # CONFIG_VHOST=y
00:09:39.381 08:27:31 -- common/build_config.sh@40 -- # CONFIG_DAOS=n
00:09:39.381 08:27:31 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:09:39.381 08:27:31 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR=
00:09:39.381 08:27:31 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n
00:09:39.381 08:27:31 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y
00:09:39.381 08:27:31 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y
00:09:39.381 08:27:31 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y
00:09:39.381 08:27:31 -- common/build_config.sh@47 -- # CONFIG_RDMA=y
00:09:39.381 08:27:31 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio
00:09:39.381 08:27:31 -- common/build_config.sh@49 -- # CONFIG_URING_PATH=
00:09:39.381 08:27:31 -- common/build_config.sh@50 -- # CONFIG_XNVME=n
00:09:39.381 08:27:32 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y
00:09:39.381 08:27:32 -- common/build_config.sh@52 -- # CONFIG_ARCH=native
00:09:39.381 08:27:32 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n
00:09:39.381 08:27:32 -- common/build_config.sh@54 -- # CONFIG_WERROR=y
00:09:39.381 08:27:32 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n
00:09:39.381 08:27:32 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y
00:09:39.381 08:27:32 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR=
00:09:39.381 08:27:32 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n
00:09:39.381 08:27:32 -- common/build_config.sh@59 -- # CONFIG_ISAL=y
00:09:39.381 08:27:32 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y
00:09:39.381 08:27:32 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:09:39.381 08:27:32 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs
00:09:39.381 08:27:32 -- common/build_config.sh@63 -- # CONFIG_APPS=y
00:09:39.381 08:27:32 -- common/build_config.sh@64 -- # CONFIG_SHARED=n
00:09:39.381 08:27:32 -- common/build_config.sh@65 -- # CONFIG_FC_PATH=
00:09:39.381 08:27:32 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n
00:09:39.381 08:27:32 -- common/build_config.sh@67 -- # CONFIG_FC=n
00:09:39.381 08:27:32 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n
00:09:39.381 08:27:32 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y
00:09:39.381 08:27:32 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n
00:09:39.381 08:27:32 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y
00:09:39.381 08:27:32 -- common/build_config.sh@72 -- # CONFIG_TESTS=y
00:09:39.381 08:27:32 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n
00:09:39.381 08:27:32 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES=
00:09:39.381 08:27:32 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n
00:09:39.381 08:27:32 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y
00:09:39.381 08:27:32 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n
00:09:39.381 08:27:32 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX=
00:09:39.381 08:27:32 -- common/build_config.sh@79 -- # CONFIG_URING=n
00:09:39.381 08:27:32 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh
00:09:39.381 08:27:32 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh
00:09:39.381 08:27:32 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common
00:09:39.381 08:27:32 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common
00:09:39.381 08:27:32 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:39.381 08:27:32 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin
00:09:39.381 08:27:32 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app
00:09:39.381 08:27:32 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples
00:09:39.381 08:27:32 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz")
00:09:39.381 08:27:32 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt")
00:09:39.381 08:27:32 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt")
00:09:39.381 08:27:32 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost")
00:09:39.381 08:27:32 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd")
00:09:39.381 08:27:32 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt")
00:09:39.381 08:27:32 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]]
00:09:39.381 08:27:32 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H
00:09:39.381 #define SPDK_CONFIG_H
00:09:39.381 #define SPDK_CONFIG_APPS 1
00:09:39.381 #define SPDK_CONFIG_ARCH native
#undef SPDK_CONFIG_ASAN 00:09:39.381 #undef SPDK_CONFIG_AVAHI 00:09:39.381 #undef SPDK_CONFIG_CET 00:09:39.381 #define SPDK_CONFIG_COVERAGE 1 00:09:39.381 #define SPDK_CONFIG_CROSS_PREFIX 00:09:39.381 #undef SPDK_CONFIG_CRYPTO 00:09:39.381 #undef SPDK_CONFIG_CRYPTO_MLX5 00:09:39.381 #undef SPDK_CONFIG_CUSTOMOCF 00:09:39.381 #undef SPDK_CONFIG_DAOS 00:09:39.381 #define SPDK_CONFIG_DAOS_DIR 00:09:39.381 #define SPDK_CONFIG_DEBUG 1 00:09:39.381 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:09:39.381 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:09:39.381 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:09:39.381 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:09:39.381 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:09:39.381 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:09:39.381 #define SPDK_CONFIG_EXAMPLES 1 00:09:39.381 #undef SPDK_CONFIG_FC 00:09:39.381 #define SPDK_CONFIG_FC_PATH 00:09:39.381 #define SPDK_CONFIG_FIO_PLUGIN 1 00:09:39.381 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:09:39.381 #undef SPDK_CONFIG_FUSE 00:09:39.381 #define SPDK_CONFIG_FUZZER 1 00:09:39.381 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:09:39.381 #undef SPDK_CONFIG_GOLANG 00:09:39.381 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:09:39.381 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:09:39.381 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:09:39.381 #undef SPDK_CONFIG_HAVE_LIBBSD 00:09:39.381 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:09:39.381 #define SPDK_CONFIG_IDXD 1 00:09:39.381 #define SPDK_CONFIG_IDXD_KERNEL 1 00:09:39.381 #undef SPDK_CONFIG_IPSEC_MB 00:09:39.381 #define SPDK_CONFIG_IPSEC_MB_DIR 00:09:39.381 #define SPDK_CONFIG_ISAL 1 00:09:39.381 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:09:39.381 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:09:39.381 #define SPDK_CONFIG_LIBDIR 00:09:39.381 #undef SPDK_CONFIG_LTO 00:09:39.381 #define SPDK_CONFIG_MAX_LCORES 00:09:39.381 #define SPDK_CONFIG_NVME_CUSE 1 00:09:39.381 #undef SPDK_CONFIG_OCF 00:09:39.381 #define SPDK_CONFIG_OCF_PATH 00:09:39.381 #define SPDK_CONFIG_OPENSSL_PATH 00:09:39.381 #undef SPDK_CONFIG_PGO_CAPTURE 00:09:39.381 #undef SPDK_CONFIG_PGO_USE 00:09:39.381 #define SPDK_CONFIG_PREFIX /usr/local 00:09:39.381 #undef SPDK_CONFIG_RAID5F 00:09:39.381 #undef SPDK_CONFIG_RBD 00:09:39.381 #define SPDK_CONFIG_RDMA 1 00:09:39.381 #define SPDK_CONFIG_RDMA_PROV verbs 00:09:39.381 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:09:39.381 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:09:39.381 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:09:39.381 #undef SPDK_CONFIG_SHARED 00:09:39.381 #undef SPDK_CONFIG_SMA 00:09:39.381 #define SPDK_CONFIG_TESTS 1 00:09:39.381 #undef SPDK_CONFIG_TSAN 00:09:39.382 #define SPDK_CONFIG_UBLK 1 00:09:39.382 #define SPDK_CONFIG_UBSAN 1 00:09:39.382 #undef SPDK_CONFIG_UNIT_TESTS 00:09:39.382 #undef SPDK_CONFIG_URING 00:09:39.382 #define SPDK_CONFIG_URING_PATH 00:09:39.382 #undef SPDK_CONFIG_URING_ZNS 00:09:39.382 #undef SPDK_CONFIG_USDT 00:09:39.382 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:09:39.382 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:09:39.382 #define SPDK_CONFIG_VFIO_USER 1 00:09:39.382 #define SPDK_CONFIG_VFIO_USER_DIR 00:09:39.382 #define SPDK_CONFIG_VHOST 1 00:09:39.382 #define SPDK_CONFIG_VIRTIO 1 00:09:39.382 #undef SPDK_CONFIG_VTUNE 00:09:39.382 #define SPDK_CONFIG_VTUNE_DIR 00:09:39.382 #define 
SPDK_CONFIG_WERROR 1 00:09:39.382 #define SPDK_CONFIG_WPDK_DIR 00:09:39.382 #undef SPDK_CONFIG_XNVME 00:09:39.382 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:09:39.382 08:27:32 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:09:39.382 08:27:32 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:09:39.382 08:27:32 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:09:39.382 08:27:32 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:09:39.382 08:27:32 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:09:39.382 08:27:32 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:39.382 08:27:32 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:39.382 08:27:32 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:39.382 08:27:32 -- paths/export.sh@5 -- # export PATH 00:09:39.382 08:27:32 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:39.382 08:27:32 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:09:39.382 08:27:32 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:09:39.382 08:27:32 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:09:39.382 08:27:32 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:09:39.382 08:27:32 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:09:39.382 08:27:32 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:39.382 
08:27:32 -- pm/common@16 -- # TEST_TAG=N/A 00:09:39.382 08:27:32 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:09:39.382 08:27:32 -- common/autotest_common.sh@52 -- # : 1 00:09:39.382 08:27:32 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:09:39.382 08:27:32 -- common/autotest_common.sh@56 -- # : 0 00:09:39.382 08:27:32 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:09:39.382 08:27:32 -- common/autotest_common.sh@58 -- # : 0 00:09:39.382 08:27:32 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:09:39.382 08:27:32 -- common/autotest_common.sh@60 -- # : 1 00:09:39.382 08:27:32 -- common/autotest_common.sh@61 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:09:39.382 08:27:32 -- common/autotest_common.sh@62 -- # : 0 00:09:39.382 08:27:32 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:09:39.382 08:27:32 -- common/autotest_common.sh@64 -- # : 00:09:39.382 08:27:32 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:09:39.382 08:27:32 -- common/autotest_common.sh@66 -- # : 0 00:09:39.382 08:27:32 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:09:39.382 08:27:32 -- common/autotest_common.sh@68 -- # : 0 00:09:39.382 08:27:32 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:09:39.382 08:27:32 -- common/autotest_common.sh@70 -- # : 0 00:09:39.382 08:27:32 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:09:39.382 08:27:32 -- common/autotest_common.sh@72 -- # : 0 00:09:39.382 08:27:32 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:09:39.382 08:27:32 -- common/autotest_common.sh@74 -- # : 0 00:09:39.382 08:27:32 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:09:39.382 08:27:32 -- common/autotest_common.sh@76 -- # : 0 00:09:39.382 08:27:32 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:09:39.382 08:27:32 -- common/autotest_common.sh@78 -- # : 0 00:09:39.382 08:27:32 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:09:39.382 08:27:32 -- common/autotest_common.sh@80 -- # : 0 00:09:39.382 08:27:32 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:09:39.382 08:27:32 -- common/autotest_common.sh@82 -- # : 0 00:09:39.382 08:27:32 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:09:39.382 08:27:32 -- common/autotest_common.sh@84 -- # : 0 00:09:39.382 08:27:32 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:09:39.382 08:27:32 -- common/autotest_common.sh@86 -- # : 0 00:09:39.382 08:27:32 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:09:39.382 08:27:32 -- common/autotest_common.sh@88 -- # : 0 00:09:39.382 08:27:32 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:09:39.382 08:27:32 -- common/autotest_common.sh@90 -- # : 0 00:09:39.382 08:27:32 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:09:39.382 08:27:32 -- common/autotest_common.sh@92 -- # : 1 00:09:39.382 08:27:32 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:09:39.382 08:27:32 -- common/autotest_common.sh@94 -- # : 1 00:09:39.382 08:27:32 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:09:39.382 08:27:32 -- common/autotest_common.sh@96 -- # : rdma 00:09:39.382 08:27:32 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:09:39.382 08:27:32 -- common/autotest_common.sh@98 -- # : 0 00:09:39.382 08:27:32 -- 
common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:09:39.382 08:27:32 -- common/autotest_common.sh@100 -- # : 0 00:09:39.382 08:27:32 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:09:39.382 08:27:32 -- common/autotest_common.sh@102 -- # : 0 00:09:39.382 08:27:32 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:09:39.382 08:27:32 -- common/autotest_common.sh@104 -- # : 0 00:09:39.382 08:27:32 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:09:39.382 08:27:32 -- common/autotest_common.sh@106 -- # : 0 00:09:39.382 08:27:32 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:09:39.382 08:27:32 -- common/autotest_common.sh@108 -- # : 0 00:09:39.643 08:27:32 -- common/autotest_common.sh@109 -- # export SPDK_TEST_VHOST_INIT 00:09:39.643 08:27:32 -- common/autotest_common.sh@110 -- # : 0 00:09:39.643 08:27:32 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:09:39.643 08:27:32 -- common/autotest_common.sh@112 -- # : 0 00:09:39.643 08:27:32 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:09:39.643 08:27:32 -- common/autotest_common.sh@114 -- # : 0 00:09:39.643 08:27:32 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:09:39.643 08:27:32 -- common/autotest_common.sh@116 -- # : 1 00:09:39.643 08:27:32 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:09:39.643 08:27:32 -- common/autotest_common.sh@118 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:09:39.643 08:27:32 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:09:39.643 08:27:32 -- common/autotest_common.sh@120 -- # : 0 00:09:39.643 08:27:32 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:09:39.643 08:27:32 -- common/autotest_common.sh@122 -- # : 0 00:09:39.643 08:27:32 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:09:39.643 08:27:32 -- common/autotest_common.sh@124 -- # : 0 00:09:39.643 08:27:32 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:09:39.643 08:27:32 -- common/autotest_common.sh@126 -- # : 0 00:09:39.643 08:27:32 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:09:39.643 08:27:32 -- common/autotest_common.sh@128 -- # : 0 00:09:39.643 08:27:32 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:09:39.643 08:27:32 -- common/autotest_common.sh@130 -- # : 0 00:09:39.643 08:27:32 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:09:39.643 08:27:32 -- common/autotest_common.sh@132 -- # : v22.11.4 00:09:39.643 08:27:32 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:09:39.643 08:27:32 -- common/autotest_common.sh@134 -- # : true 00:09:39.643 08:27:32 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:09:39.643 08:27:32 -- common/autotest_common.sh@136 -- # : 0 00:09:39.643 08:27:32 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:09:39.643 08:27:32 -- common/autotest_common.sh@138 -- # : 0 00:09:39.643 08:27:32 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:09:39.643 08:27:32 -- common/autotest_common.sh@140 -- # : 0 00:09:39.643 08:27:32 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:09:39.643 08:27:32 -- common/autotest_common.sh@142 -- # : 0 00:09:39.643 08:27:32 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:09:39.643 08:27:32 -- common/autotest_common.sh@144 -- # : 0 00:09:39.643 08:27:32 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:09:39.643 08:27:32 
-- common/autotest_common.sh@146 -- # : 0 00:09:39.643 08:27:32 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:09:39.643 08:27:32 -- common/autotest_common.sh@148 -- # : 00:09:39.643 08:27:32 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:09:39.643 08:27:32 -- common/autotest_common.sh@150 -- # : 0 00:09:39.643 08:27:32 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:09:39.643 08:27:32 -- common/autotest_common.sh@152 -- # : 0 00:09:39.643 08:27:32 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:09:39.643 08:27:32 -- common/autotest_common.sh@154 -- # : 0 00:09:39.643 08:27:32 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:09:39.643 08:27:32 -- common/autotest_common.sh@156 -- # : 0 00:09:39.643 08:27:32 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:09:39.643 08:27:32 -- common/autotest_common.sh@158 -- # : 0 00:09:39.643 08:27:32 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:09:39.643 08:27:32 -- common/autotest_common.sh@160 -- # : 0 00:09:39.643 08:27:32 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:09:39.643 08:27:32 -- common/autotest_common.sh@163 -- # : 00:09:39.643 08:27:32 -- common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:09:39.643 08:27:32 -- common/autotest_common.sh@165 -- # : 0 00:09:39.643 08:27:32 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:09:39.643 08:27:32 -- common/autotest_common.sh@167 -- # : 0 00:09:39.643 08:27:32 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:09:39.643 08:27:32 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:09:39.643 08:27:32 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:09:39.643 08:27:32 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:09:39.643 08:27:32 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:09:39.643 08:27:32 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:09:39.643 08:27:32 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:09:39.644 08:27:32 -- common/autotest_common.sh@174 -- # export 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:09:39.644 08:27:32 -- common/autotest_common.sh@174 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:09:39.644 08:27:32 -- common/autotest_common.sh@177 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:09:39.644 08:27:32 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:09:39.644 08:27:32 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:09:39.644 08:27:32 -- common/autotest_common.sh@181 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:09:39.644 08:27:32 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:09:39.644 08:27:32 -- common/autotest_common.sh@185 -- 
# PYTHONDONTWRITEBYTECODE=1 00:09:39.644 08:27:32 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:09:39.644 08:27:32 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:09:39.644 08:27:32 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:09:39.644 08:27:32 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:09:39.644 08:27:32 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:09:39.644 08:27:32 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:09:39.644 08:27:32 -- common/autotest_common.sh@196 -- # cat 00:09:39.644 08:27:32 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:09:39.644 08:27:32 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:09:39.644 08:27:32 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:09:39.644 08:27:32 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:09:39.644 08:27:32 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:09:39.644 08:27:32 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:09:39.644 08:27:32 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:09:39.644 08:27:32 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:09:39.644 08:27:32 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:09:39.644 08:27:32 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:09:39.644 08:27:32 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:09:39.644 08:27:32 -- common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:09:39.644 08:27:32 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:09:39.644 08:27:32 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:09:39.644 08:27:32 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:09:39.644 08:27:32 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:09:39.644 08:27:32 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:09:39.644 08:27:32 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:09:39.644 08:27:32 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:09:39.644 08:27:32 -- common/autotest_common.sh@248 -- # '[' 0 -eq 0 ']' 00:09:39.644 08:27:32 -- common/autotest_common.sh@249 -- # export valgrind= 00:09:39.644 08:27:32 -- common/autotest_common.sh@249 -- # valgrind= 00:09:39.644 08:27:32 -- common/autotest_common.sh@255 -- # uname -s 00:09:39.644 08:27:32 -- 
common/autotest_common.sh@255 -- # '[' Linux = Linux ']' 00:09:39.644 08:27:32 -- common/autotest_common.sh@256 -- # HUGEMEM=4096 00:09:39.644 08:27:32 -- common/autotest_common.sh@257 -- # export CLEAR_HUGE=yes 00:09:39.644 08:27:32 -- common/autotest_common.sh@257 -- # CLEAR_HUGE=yes 00:09:39.644 08:27:32 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:09:39.644 08:27:32 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:09:39.644 08:27:32 -- common/autotest_common.sh@265 -- # MAKE=make 00:09:39.644 08:27:32 -- common/autotest_common.sh@266 -- # MAKEFLAGS=-j112 00:09:39.644 08:27:32 -- common/autotest_common.sh@282 -- # export HUGEMEM=4096 00:09:39.644 08:27:32 -- common/autotest_common.sh@282 -- # HUGEMEM=4096 00:09:39.644 08:27:32 -- common/autotest_common.sh@284 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:09:39.644 08:27:32 -- common/autotest_common.sh@289 -- # NO_HUGE=() 00:09:39.644 08:27:32 -- common/autotest_common.sh@290 -- # TEST_MODE= 00:09:39.644 08:27:32 -- common/autotest_common.sh@309 -- # [[ -z 1023786 ]] 00:09:39.644 08:27:32 -- common/autotest_common.sh@309 -- # kill -0 1023786 00:09:39.644 08:27:32 -- common/autotest_common.sh@1665 -- # set_test_storage 2147483648 00:09:39.644 08:27:32 -- common/autotest_common.sh@319 -- # [[ -v testdir ]] 00:09:39.644 08:27:32 -- common/autotest_common.sh@321 -- # local requested_size=2147483648 00:09:39.644 08:27:32 -- common/autotest_common.sh@322 -- # local mount target_dir 00:09:39.644 08:27:32 -- common/autotest_common.sh@324 -- # local -A mounts fss sizes avails uses 00:09:39.644 08:27:32 -- common/autotest_common.sh@325 -- # local source fs size avail mount use 00:09:39.644 08:27:32 -- common/autotest_common.sh@327 -- # local storage_fallback storage_candidates 00:09:39.644 08:27:32 -- common/autotest_common.sh@329 -- # mktemp -udt spdk.XXXXXX 00:09:39.644 08:27:32 -- common/autotest_common.sh@329 -- # storage_fallback=/tmp/spdk.f1zZWK 00:09:39.644 08:27:32 -- common/autotest_common.sh@334 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:09:39.644 08:27:32 -- common/autotest_common.sh@336 -- # [[ -n '' ]] 00:09:39.644 08:27:32 -- common/autotest_common.sh@341 -- # [[ -n '' ]] 00:09:39.644 08:27:32 -- common/autotest_common.sh@346 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.f1zZWK/tests/vfio /tmp/spdk.f1zZWK 00:09:39.644 08:27:32 -- common/autotest_common.sh@349 -- # requested_size=2214592512 00:09:39.644 08:27:32 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:09:39.644 08:27:32 -- common/autotest_common.sh@318 -- # df -T 00:09:39.644 08:27:32 -- common/autotest_common.sh@318 -- # grep -v Filesystem 00:09:39.644 08:27:32 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_devtmpfs 00:09:39.644 08:27:32 -- common/autotest_common.sh@352 -- # fss["$mount"]=devtmpfs 00:09:39.644 08:27:32 -- common/autotest_common.sh@353 -- # avails["$mount"]=67108864 00:09:39.644 08:27:32 -- common/autotest_common.sh@353 -- # sizes["$mount"]=67108864 00:09:39.644 08:27:32 -- common/autotest_common.sh@354 -- # uses["$mount"]=0 00:09:39.644 08:27:32 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:09:39.644 08:27:32 -- common/autotest_common.sh@352 -- # mounts["$mount"]=/dev/pmem0 00:09:39.644 08:27:32 -- common/autotest_common.sh@352 -- # fss["$mount"]=ext2 00:09:39.644 08:27:32 -- common/autotest_common.sh@353 -- # 
avails["$mount"]=678330368 00:09:39.644 08:27:32 -- common/autotest_common.sh@353 -- # sizes["$mount"]=5284429824 00:09:39.645 08:27:32 -- common/autotest_common.sh@354 -- # uses["$mount"]=4606099456 00:09:39.645 08:27:32 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:09:39.645 08:27:32 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_root 00:09:39.645 08:27:32 -- common/autotest_common.sh@352 -- # fss["$mount"]=overlay 00:09:39.645 08:27:32 -- common/autotest_common.sh@353 -- # avails["$mount"]=52128038912 00:09:39.645 08:27:32 -- common/autotest_common.sh@353 -- # sizes["$mount"]=61730590720 00:09:39.645 08:27:32 -- common/autotest_common.sh@354 -- # uses["$mount"]=9602551808 00:09:39.645 08:27:32 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:09:39.645 08:27:32 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:09:39.645 08:27:32 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:09:39.645 08:27:32 -- common/autotest_common.sh@353 -- # avails["$mount"]=30864035840 00:09:39.645 08:27:32 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30865293312 00:09:39.645 08:27:32 -- common/autotest_common.sh@354 -- # uses["$mount"]=1257472 00:09:39.645 08:27:32 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:09:39.645 08:27:32 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:09:39.645 08:27:32 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:09:39.645 08:27:32 -- common/autotest_common.sh@353 -- # avails["$mount"]=12340125696 00:09:39.645 08:27:32 -- common/autotest_common.sh@353 -- # sizes["$mount"]=12346118144 00:09:39.645 08:27:32 -- common/autotest_common.sh@354 -- # uses["$mount"]=5992448 00:09:39.645 08:27:32 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:09:39.645 08:27:32 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:09:39.645 08:27:32 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:09:39.645 08:27:32 -- common/autotest_common.sh@353 -- # avails["$mount"]=30864527360 00:09:39.645 08:27:32 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30865297408 00:09:39.645 08:27:32 -- common/autotest_common.sh@354 -- # uses["$mount"]=770048 00:09:39.645 08:27:32 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:09:39.645 08:27:32 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:09:39.645 08:27:32 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:09:39.645 08:27:32 -- common/autotest_common.sh@353 -- # avails["$mount"]=6173044736 00:09:39.645 08:27:32 -- common/autotest_common.sh@353 -- # sizes["$mount"]=6173057024 00:09:39.645 08:27:32 -- common/autotest_common.sh@354 -- # uses["$mount"]=12288 00:09:39.645 08:27:32 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:09:39.645 08:27:32 -- common/autotest_common.sh@357 -- # printf '* Looking for test storage...\n' 00:09:39.645 * Looking for test storage... 
00:09:39.645 08:27:32 -- common/autotest_common.sh@359 -- # local target_space new_size 00:09:39.645 08:27:32 -- common/autotest_common.sh@360 -- # for target_dir in "${storage_candidates[@]}" 00:09:39.645 08:27:32 -- common/autotest_common.sh@363 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:09:39.645 08:27:32 -- common/autotest_common.sh@363 -- # awk '$1 !~ /Filesystem/{print $6}' 00:09:39.645 08:27:32 -- common/autotest_common.sh@363 -- # mount=/ 00:09:39.645 08:27:32 -- common/autotest_common.sh@365 -- # target_space=52128038912 00:09:39.645 08:27:32 -- common/autotest_common.sh@366 -- # (( target_space == 0 || target_space < requested_size )) 00:09:39.645 08:27:32 -- common/autotest_common.sh@369 -- # (( target_space >= requested_size )) 00:09:39.645 08:27:32 -- common/autotest_common.sh@371 -- # [[ overlay == tmpfs ]] 00:09:39.645 08:27:32 -- common/autotest_common.sh@371 -- # [[ overlay == ramfs ]] 00:09:39.645 08:27:32 -- common/autotest_common.sh@371 -- # [[ / == / ]] 00:09:39.645 08:27:32 -- common/autotest_common.sh@372 -- # new_size=11817144320 00:09:39.645 08:27:32 -- common/autotest_common.sh@373 -- # (( new_size * 100 / sizes[/] > 95 )) 00:09:39.645 08:27:32 -- common/autotest_common.sh@378 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:09:39.645 08:27:32 -- common/autotest_common.sh@378 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:09:39.645 08:27:32 -- common/autotest_common.sh@379 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:09:39.645 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:09:39.645 08:27:32 -- common/autotest_common.sh@380 -- # return 0 00:09:39.645 08:27:32 -- common/autotest_common.sh@1667 -- # set -o errtrace 00:09:39.645 08:27:32 -- common/autotest_common.sh@1668 -- # shopt -s extdebug 00:09:39.645 08:27:32 -- common/autotest_common.sh@1669 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:09:39.645 08:27:32 -- common/autotest_common.sh@1671 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:09:39.645 08:27:32 -- common/autotest_common.sh@1672 -- # true 00:09:39.645 08:27:32 -- common/autotest_common.sh@1674 -- # xtrace_fd 00:09:39.645 08:27:32 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:09:39.645 08:27:32 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:09:39.645 08:27:32 -- common/autotest_common.sh@27 -- # exec 00:09:39.645 08:27:32 -- common/autotest_common.sh@29 -- # exec 00:09:39.645 08:27:32 -- common/autotest_common.sh@31 -- # xtrace_restore 00:09:39.645 08:27:32 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:09:39.645 08:27:32 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:09:39.645 08:27:32 -- common/autotest_common.sh@18 -- # set -x 00:09:39.645 08:27:32 -- vfio/run.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:09:39.645 08:27:32 -- ../common.sh@8 -- # pids=() 00:09:39.645 08:27:32 -- vfio/run.sh@58 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:09:39.645 08:27:32 -- vfio/run.sh@59 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:09:39.645 08:27:32 -- vfio/run.sh@59 -- # fuzz_num=7 00:09:39.645 08:27:32 -- vfio/run.sh@60 -- # (( fuzz_num != 0 )) 00:09:39.645 08:27:32 -- vfio/run.sh@62 -- # trap 'cleanup /tmp/vfio-user-*; exit 1' SIGINT SIGTERM EXIT 00:09:39.645 08:27:32 -- vfio/run.sh@65 -- # mem_size=0 00:09:39.645 08:27:32 -- vfio/run.sh@66 -- # [[ 1 -eq 1 ]] 00:09:39.645 08:27:32 -- vfio/run.sh@67 -- # start_llvm_fuzz_short 7 1 00:09:39.645 08:27:32 -- ../common.sh@69 -- # local fuzz_num=7 00:09:39.645 08:27:32 -- ../common.sh@70 -- # local time=1 00:09:39.645 08:27:32 -- ../common.sh@72 -- # (( i = 0 )) 00:09:39.645 08:27:32 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:39.645 08:27:32 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:09:39.645 08:27:32 -- vfio/run.sh@22 -- # local fuzzer_type=0 00:09:39.645 08:27:32 -- vfio/run.sh@23 -- # local timen=1 00:09:39.645 08:27:32 -- vfio/run.sh@24 -- # local core=0x1 00:09:39.645 08:27:32 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:09:39.645 08:27:32 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:09:39.645 08:27:32 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:09:39.645 08:27:32 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:09:39.645 08:27:32 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:09:39.645 08:27:32 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:09:39.645 08:27:32 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:09:39.645 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:39.645 08:27:32 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 -Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:09:39.645 [2024-10-04 08:27:32.191615] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
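That last xtrace command is the whole vfio fuzz harness launch in one line. For each fuzzer index N, vfio/run.sh builds a private workspace, rewrites the template fuzz_vfio_json.conf so its vfio-user socket paths land in that workspace, then starts llvm_vfio_fuzz pinned to one core. The sketch below restates those steps for a generic N; the flag comments are inferred from the locals visible in the trace (timen=1, core=0x1, mem_size=0) rather than from documentation, and the redirection on the sed step is assumed, since the trace shows only the expression.

    N=0
    root=/var/jenkins/workspace/short-fuzz-phy-autotest
    corpus=$root/spdk/../corpus/llvm_vfio_$N
    mkdir -p /tmp/vfio-user-$N/domain/1 /tmp/vfio-user-$N/domain/2 "$corpus"
    # retarget the template JSON config at this fuzzer's private socket dirs
    sed -e "s%/tmp/vfio-user/domain/1%/tmp/vfio-user-$N/domain/1%;
            s%/tmp/vfio-user/domain/2%/tmp/vfio-user-$N/domain/2%" \
        "$root/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf" \
        > /tmp/vfio-user-$N/fuzz_vfio_json.conf
    args=(
        -m 0x1                                    # reactor core mask (core=0x1)
        -s 0                                      # hugepage pool size in MB (mem_size=0)
        -P "$root/spdk/../output/llvm/"           # output prefix, per the command above
        -F /tmp/vfio-user-$N/domain/1             # vfio-user endpoint dir the fuzzer drives
        -c /tmp/vfio-user-$N/fuzz_vfio_json.conf  # the JSON config rewritten above
        -t 1                                      # seconds per fuzzer (timen=1)
        -D "$corpus"                              # libFuzzer corpus directory
        -Y /tmp/vfio-user-$N/domain/2             # io-path endpoint dir (vfiouser_io_dir)
        -r /tmp/vfio-user-$N/spdk$N.sock          # RPC listen socket
        -Z $N                                     # fuzzer index; run.sh counted 7 '.fn =' targets
    )
    "$root/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz" "${args[@]}"

Running all seven targets is then just the loop visible in the ../common.sh trace: (( i = 0 )); (( i < fuzz_num )); start_llvm_fuzz $i; (( i++ )).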
00:09:39.645 [2024-10-04 08:27:32.191693] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1023827 ] 00:09:39.646 EAL: No free 2048 kB hugepages reported on node 1 00:09:39.646 [2024-10-04 08:27:32.262848] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:39.646 [2024-10-04 08:27:32.299451] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:39.646 [2024-10-04 08:27:32.299599] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:39.905 INFO: Running with entropic power schedule (0xFF, 100). 00:09:39.905 INFO: Seed: 529043812 00:09:39.905 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:09:39.905 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:09:39.905 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:09:39.905 INFO: A corpus is not provided, starting from an empty corpus 00:09:39.905 #2 INITED exec/s: 0 rss: 60Mb 00:09:39.905 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:39.905 This may also happen if the target rejected all inputs we tried so far 00:09:40.423 NEW_FUNC[1/631]: 0x450dd8 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:85 00:09:40.423 NEW_FUNC[2/631]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:40.423 #6 NEW cov: 10762 ft: 10595 corp: 2/11b lim: 60 exec/s: 0 rss: 65Mb L: 10/10 MS: 4 ChangeByte-CrossOver-ChangeByte-CMP- DE: "\000\000\000\000\000\004\000\000"- 00:09:40.683 #7 NEW cov: 10779 ft: 13674 corp: 3/21b lim: 60 exec/s: 0 rss: 67Mb L: 10/10 MS: 1 ChangeByte- 00:09:40.942 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:40.942 #8 NEW cov: 10796 ft: 15256 corp: 4/30b lim: 60 exec/s: 0 rss: 68Mb L: 9/10 MS: 1 EraseBytes- 00:09:40.942 #9 NEW cov: 10796 ft: 15928 corp: 5/39b lim: 60 exec/s: 9 rss: 68Mb L: 9/10 MS: 1 CrossOver- 00:09:41.201 #10 NEW cov: 10796 ft: 17121 corp: 6/49b lim: 60 exec/s: 10 rss: 68Mb L: 10/10 MS: 1 ChangeBit- 00:09:41.461 #11 NEW cov: 10796 ft: 17324 corp: 7/68b lim: 60 exec/s: 11 rss: 68Mb L: 19/19 MS: 1 CrossOver- 00:09:41.461 #12 NEW cov: 10796 ft: 17639 corp: 8/79b lim: 60 exec/s: 12 rss: 68Mb L: 11/19 MS: 1 CrossOver- 00:09:41.721 #14 NEW cov: 10803 ft: 17955 corp: 9/89b lim: 60 exec/s: 14 rss: 68Mb L: 10/19 MS: 2 CrossOver-CopyPart- 00:09:41.985 #15 NEW cov: 10803 ft: 18079 corp: 10/99b lim: 60 exec/s: 7 rss: 68Mb L: 10/19 MS: 1 ShuffleBytes- 00:09:41.985 #15 DONE cov: 10803 ft: 18079 corp: 10/99b lim: 60 exec/s: 7 rss: 68Mb 00:09:41.985 ###### Recommended dictionary. ###### 00:09:41.985 "\000\000\000\000\000\004\000\000" # Uses: 0 00:09:41.985 ###### End of recommended dictionary. 
###### 00:09:41.985 Done 15 runs in 2 second(s) 00:09:42.245 08:27:34 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-0 00:09:42.245 08:27:34 -- ../common.sh@72 -- # (( i++ )) 00:09:42.245 08:27:34 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:42.245 08:27:34 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:09:42.245 08:27:34 -- vfio/run.sh@22 -- # local fuzzer_type=1 00:09:42.245 08:27:34 -- vfio/run.sh@23 -- # local timen=1 00:09:42.245 08:27:34 -- vfio/run.sh@24 -- # local core=0x1 00:09:42.245 08:27:34 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:09:42.245 08:27:34 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:09:42.245 08:27:34 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:09:42.245 08:27:34 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:09:42.245 08:27:34 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:09:42.245 08:27:34 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:09:42.245 08:27:34 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:09:42.245 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:42.245 08:27:34 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1 00:09:42.245 [2024-10-04 08:27:34.816795] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:09:42.245 [2024-10-04 08:27:34.816890] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1024376 ] 00:09:42.245 EAL: No free 2048 kB hugepages reported on node 1 00:09:42.245 [2024-10-04 08:27:34.887734] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:42.245 [2024-10-04 08:27:34.922296] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:42.245 [2024-10-04 08:27:34.922451] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:42.505 INFO: Running with entropic power schedule (0xFF, 100). 00:09:42.505 INFO: Seed: 3150038147 00:09:42.505 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:09:42.505 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:09:42.505 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:09:42.505 INFO: A corpus is not provided, starting from an empty corpus 00:09:42.505 #2 INITED exec/s: 0 rss: 59Mb 00:09:42.505 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:42.505 This may also happen if the target rejected all inputs we tried so far 00:09:42.505 [2024-10-04 08:27:35.175205] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:42.505 [2024-10-04 08:27:35.175260] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:42.505 [2024-10-04 08:27:35.175282] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:43.024 NEW_FUNC[1/638]: 0x451378 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:72 00:09:43.024 NEW_FUNC[2/638]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:43.024 #6 NEW cov: 10779 ft: 10422 corp: 2/33b lim: 40 exec/s: 0 rss: 66Mb L: 32/32 MS: 4 ChangeByte-CrossOver-ChangeBit-InsertRepeatedBytes- 00:09:43.024 [2024-10-04 08:27:35.579906] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:43.024 [2024-10-04 08:27:35.579947] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:43.024 [2024-10-04 08:27:35.579967] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:43.024 #7 NEW cov: 10796 ft: 12963 corp: 3/54b lim: 40 exec/s: 0 rss: 67Mb L: 21/32 MS: 1 EraseBytes- 00:09:43.024 [2024-10-04 08:27:35.694721] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:43.024 [2024-10-04 08:27:35.694751] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:43.024 [2024-10-04 08:27:35.694770] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:43.284 #10 NEW cov: 10796 ft: 14314 corp: 4/59b lim: 40 exec/s: 0 rss: 68Mb L: 5/32 MS: 3 ChangeByte-CrossOver-CMP- DE: "\011\000\000\000"- 00:09:43.284 [2024-10-04 08:27:35.819627] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:43.284 [2024-10-04 08:27:35.819656] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:43.284 [2024-10-04 08:27:35.819676] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:43.284 #14 NEW cov: 10796 ft: 15217 corp: 5/65b lim: 40 exec/s: 0 rss: 68Mb L: 6/32 MS: 4 EraseBytes-ChangeBinInt-EraseBytes-PersAutoDict- DE: "\011\000\000\000"- 00:09:43.284 [2024-10-04 08:27:35.933522] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:43.284 [2024-10-04 08:27:35.933550] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:43.284 [2024-10-04 08:27:35.933569] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:43.543 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:43.543 #15 NEW cov: 10813 ft: 15546 corp: 6/71b lim: 40 exec/s: 0 rss: 68Mb L: 6/32 MS: 1 CopyPart- 00:09:43.543 [2024-10-04 08:27:36.049464] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:43.543 [2024-10-04 08:27:36.049491] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:43.543 [2024-10-04 08:27:36.049509] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:43.543 #17 NEW cov: 10813 ft: 15647 corp: 7/76b lim: 40 
exec/s: 17 rss: 68Mb L: 5/32 MS: 2 ChangeBit-PersAutoDict- DE: "\011\000\000\000"- 00:09:43.543 [2024-10-04 08:27:36.164400] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:43.543 [2024-10-04 08:27:36.164427] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:43.543 [2024-10-04 08:27:36.164447] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:43.802 #18 NEW cov: 10813 ft: 15953 corp: 8/82b lim: 40 exec/s: 18 rss: 68Mb L: 6/32 MS: 1 ShuffleBytes- 00:09:43.802 [2024-10-04 08:27:36.279195] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:43.802 [2024-10-04 08:27:36.279221] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:43.802 [2024-10-04 08:27:36.279239] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:43.802 #19 NEW cov: 10813 ft: 16315 corp: 9/88b lim: 40 exec/s: 19 rss: 68Mb L: 6/32 MS: 1 CopyPart- 00:09:43.802 [2024-10-04 08:27:36.393981] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:43.802 [2024-10-04 08:27:36.394006] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:43.802 [2024-10-04 08:27:36.394029] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:43.802 #20 NEW cov: 10813 ft: 16412 corp: 10/94b lim: 40 exec/s: 20 rss: 68Mb L: 6/32 MS: 1 ChangeBit- 00:09:44.061 [2024-10-04 08:27:36.507945] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:44.061 [2024-10-04 08:27:36.507974] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:44.061 [2024-10-04 08:27:36.507993] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:44.061 #21 NEW cov: 10813 ft: 16577 corp: 11/100b lim: 40 exec/s: 21 rss: 68Mb L: 6/32 MS: 1 ShuffleBytes- 00:09:44.061 [2024-10-04 08:27:36.622684] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:44.061 [2024-10-04 08:27:36.622712] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:44.061 [2024-10-04 08:27:36.622731] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:44.061 #22 NEW cov: 10813 ft: 16797 corp: 12/133b lim: 40 exec/s: 22 rss: 69Mb L: 33/33 MS: 1 InsertByte- 00:09:44.061 [2024-10-04 08:27:36.736555] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:44.061 [2024-10-04 08:27:36.736583] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:44.061 [2024-10-04 08:27:36.736603] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:44.320 #23 NEW cov: 10813 ft: 16962 corp: 13/167b lim: 40 exec/s: 23 rss: 69Mb L: 34/34 MS: 1 CrossOver- 00:09:44.320 [2024-10-04 08:27:36.851495] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:44.320 [2024-10-04 08:27:36.851521] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:44.320 [2024-10-04 08:27:36.851541] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:44.320 #24 NEW cov: 10813 ft: 17037 corp: 14/173b lim: 40 exec/s: 24 rss: 69Mb L: 6/34 MS: 1 InsertByte- 00:09:44.320 [2024-10-04 
08:27:36.966346] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:44.320 [2024-10-04 08:27:36.966371] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:44.320 [2024-10-04 08:27:36.966390] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:44.579 #25 NEW cov: 10820 ft: 17088 corp: 15/212b lim: 40 exec/s: 25 rss: 69Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:09:44.579 [2024-10-04 08:27:37.080153] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:44.579 [2024-10-04 08:27:37.080179] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:44.579 [2024-10-04 08:27:37.080202] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:44.579 #26 NEW cov: 10820 ft: 17128 corp: 16/226b lim: 40 exec/s: 13 rss: 69Mb L: 14/39 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:09:44.579 #26 DONE cov: 10820 ft: 17128 corp: 16/226b lim: 40 exec/s: 13 rss: 69Mb 00:09:44.579 ###### Recommended dictionary. ###### 00:09:44.579 "\011\000\000\000" # Uses: 2 00:09:44.579 "\000\000\000\000\000\000\000\000" # Uses: 0 00:09:44.579 ###### End of recommended dictionary. ###### 00:09:44.579 Done 26 runs in 2 second(s) 00:09:44.839 08:27:37 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-1 00:09:44.839 08:27:37 -- ../common.sh@72 -- # (( i++ )) 00:09:44.839 08:27:37 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:44.839 08:27:37 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:09:44.839 08:27:37 -- vfio/run.sh@22 -- # local fuzzer_type=2 00:09:44.839 08:27:37 -- vfio/run.sh@23 -- # local timen=1 00:09:44.839 08:27:37 -- vfio/run.sh@24 -- # local core=0x1 00:09:44.839 08:27:37 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:09:44.839 08:27:37 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:09:44.839 08:27:37 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:09:44.839 08:27:37 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:09:44.839 08:27:37 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:09:44.839 08:27:37 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:09:44.839 08:27:37 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:09:44.839 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:44.839 08:27:37 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:09:44.839 [2024-10-04 08:27:37.455334] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
00:09:44.839 [2024-10-04 08:27:37.455427] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1024770 ] 00:09:44.839 EAL: No free 2048 kB hugepages reported on node 1 00:09:45.098 [2024-10-04 08:27:37.527760] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:45.098 [2024-10-04 08:27:37.563367] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:45.098 [2024-10-04 08:27:37.563514] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:45.098 INFO: Running with entropic power schedule (0xFF, 100). 00:09:45.098 INFO: Seed: 1503090388 00:09:45.098 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:09:45.098 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:09:45.098 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:09:45.098 INFO: A corpus is not provided, starting from an empty corpus 00:09:45.098 #2 INITED exec/s: 0 rss: 60Mb 00:09:45.098 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:45.098 This may also happen if the target rejected all inputs we tried so far 00:09:45.357 [2024-10-04 08:27:37.849521] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:45.617 NEW_FUNC[1/634]: 0x451d68 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:104 00:09:45.617 NEW_FUNC[2/634]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:45.617 #13 NEW cov: 10750 ft: 10722 corp: 2/13b lim: 80 exec/s: 0 rss: 65Mb L: 12/12 MS: 1 InsertRepeatedBytes- 00:09:45.876 [2024-10-04 08:27:38.304018] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:45.876 NEW_FUNC[1/2]: 0x15ed068 in _is_io_flags_valid /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_ns_cmd.c:141 00:09:45.876 NEW_FUNC[2/2]: 0x1609fa8 in _nvme_md_excluded_from_xfer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_ns_cmd.c:54 00:09:45.876 #21 NEW cov: 10772 ft: 13267 corp: 3/28b lim: 80 exec/s: 0 rss: 67Mb L: 15/15 MS: 3 EraseBytes-ChangeByte-CMP- DE: "\000p\201\023\000 \000\000"- 00:09:45.876 [2024-10-04 08:27:38.487973] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:46.135 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:46.135 #22 NEW cov: 10789 ft: 14927 corp: 4/44b lim: 80 exec/s: 0 rss: 68Mb L: 16/16 MS: 1 CrossOver- 00:09:46.135 [2024-10-04 08:27:38.670599] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:46.135 #23 NEW cov: 10789 ft: 15446 corp: 5/56b lim: 80 exec/s: 23 rss: 68Mb L: 12/16 MS: 1 PersAutoDict- DE: "\000p\201\023\000 \000\000"- 00:09:46.395 [2024-10-04 08:27:38.855545] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:09:46.395 [2024-10-04 08:27:38.855581] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:09:46.395 NEW_FUNC[1/2]: 0x1330b08 in endpoint_id 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:638 00:09:46.395 NEW_FUNC[2/2]: 0x1330da8 in vfio_user_log /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:3084 00:09:46.395 #24 NEW cov: 10802 ft: 15619 corp: 6/87b lim: 80 exec/s: 24 rss: 68Mb L: 31/31 MS: 1 InsertRepeatedBytes- 00:09:46.395 [2024-10-04 08:27:39.048898] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:46.654 #28 NEW cov: 10802 ft: 15882 corp: 7/153b lim: 80 exec/s: 28 rss: 68Mb L: 66/66 MS: 4 ChangeByte-ShuffleBytes-CopyPart-InsertRepeatedBytes- 00:09:46.654 [2024-10-04 08:27:39.233793] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:46.912 #29 NEW cov: 10802 ft: 16238 corp: 8/179b lim: 80 exec/s: 29 rss: 68Mb L: 26/66 MS: 1 CrossOver- 00:09:46.912 [2024-10-04 08:27:39.416929] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:46.912 #30 NEW cov: 10802 ft: 16407 corp: 9/245b lim: 80 exec/s: 30 rss: 68Mb L: 66/66 MS: 1 ChangeBit- 00:09:47.171 [2024-10-04 08:27:39.600583] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:47.171 #31 NEW cov: 10809 ft: 16576 corp: 10/257b lim: 80 exec/s: 31 rss: 68Mb L: 12/66 MS: 1 ChangeBinInt- 00:09:47.171 [2024-10-04 08:27:39.782570] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:47.429 #32 pulse cov: 10809 ft: 16676 corp: 10/257b lim: 80 exec/s: 16 rss: 68Mb 00:09:47.429 #32 NEW cov: 10809 ft: 16676 corp: 11/323b lim: 80 exec/s: 16 rss: 68Mb L: 66/66 MS: 1 ChangeByte- 00:09:47.429 #32 DONE cov: 10809 ft: 16676 corp: 11/323b lim: 80 exec/s: 16 rss: 68Mb 00:09:47.429 ###### Recommended dictionary. ###### 00:09:47.429 "\000p\201\023\000 \000\000" # Uses: 1 00:09:47.429 ###### End of recommended dictionary. 
###### 00:09:47.429 Done 32 runs in 2 second(s) 00:09:47.688 08:27:40 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-2 00:09:47.688 08:27:40 -- ../common.sh@72 -- # (( i++ )) 00:09:47.688 08:27:40 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:47.688 08:27:40 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:09:47.688 08:27:40 -- vfio/run.sh@22 -- # local fuzzer_type=3 00:09:47.688 08:27:40 -- vfio/run.sh@23 -- # local timen=1 00:09:47.688 08:27:40 -- vfio/run.sh@24 -- # local core=0x1 00:09:47.688 08:27:40 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:09:47.688 08:27:40 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:09:47.688 08:27:40 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:09:47.688 08:27:40 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:09:47.688 08:27:40 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:09:47.688 08:27:40 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:09:47.688 08:27:40 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:09:47.688 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:47.688 08:27:40 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:09:47.688 [2024-10-04 08:27:40.190083] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:09:47.688 [2024-10-04 08:27:40.190156] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1025213 ] 00:09:47.688 EAL: No free 2048 kB hugepages reported on node 1 00:09:47.688 [2024-10-04 08:27:40.263497] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:47.688 [2024-10-04 08:27:40.299898] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:47.688 [2024-10-04 08:27:40.300053] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:47.946 INFO: Running with entropic power schedule (0xFF, 100). 00:09:47.946 INFO: Seed: 4245076564 00:09:47.946 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:09:47.946 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:09:47.946 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:09:47.946 INFO: A corpus is not provided, starting from an empty corpus 00:09:47.946 #2 INITED exec/s: 0 rss: 60Mb 00:09:47.946 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:47.946 This may also happen if the target rejected all inputs we tried so far 00:09:47.946 [2024-10-04 08:27:40.589276] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=323 offset=0 prot=0x3: Invalid argument 00:09:47.946 [2024-10-04 08:27:40.589314] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:09:47.946 [2024-10-04 08:27:40.589325] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:09:47.946 [2024-10-04 08:27:40.589351] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:48.462 NEW_FUNC[1/638]: 0x452458 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:125 00:09:48.462 NEW_FUNC[2/638]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:48.462 #37 NEW cov: 10740 ft: 10614 corp: 2/116b lim: 320 exec/s: 0 rss: 65Mb L: 115/115 MS: 5 InsertByte-ChangeByte-EraseBytes-CopyPart-InsertRepeatedBytes- 00:09:48.462 [2024-10-04 08:27:41.028378] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:09:48.462 [2024-10-04 08:27:41.028414] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:09:48.462 [2024-10-04 08:27:41.028425] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:09:48.462 [2024-10-04 08:27:41.028442] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:48.462 #43 NEW cov: 10790 ft: 13774 corp: 3/231b lim: 320 exec/s: 0 rss: 67Mb L: 115/115 MS: 1 ChangeBit- 00:09:48.720 [2024-10-04 08:27:41.199633] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [0x800000000, 0x800000000) fd=325 offset=0 prot=0x3: Invalid argument 00:09:48.720 [2024-10-04 08:27:41.199656] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0x800000000, 0x800000000) offset=0 flags=0x3: Invalid argument 00:09:48.720 [2024-10-04 08:27:41.199666] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:09:48.720 [2024-10-04 08:27:41.199683] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:48.720 #44 NEW cov: 10790 ft: 13995 corp: 4/346b lim: 320 exec/s: 0 rss: 68Mb L: 115/115 MS: 1 ChangeBit- 00:09:48.720 [2024-10-04 08:27:41.370072] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0x17000000000000 prot=0x3: Invalid argument 00:09:48.720 [2024-10-04 08:27:41.370096] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0x17000000000000 flags=0x3: Invalid argument 00:09:48.720 [2024-10-04 08:27:41.370107] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:09:48.720 [2024-10-04 08:27:41.370124] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:48.978 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:48.978 #45 NEW cov: 
10807 ft: 14300 corp: 5/462b lim: 320 exec/s: 0 rss: 68Mb L: 116/116 MS: 1 InsertByte- 00:09:48.978 [2024-10-04 08:27:41.540166] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [0x800000000, 0x800000000) fd=325 offset=0 prot=0x3: Invalid argument 00:09:48.978 [2024-10-04 08:27:41.540198] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0x800000000, 0x800000000) offset=0 flags=0x3: Invalid argument 00:09:48.978 [2024-10-04 08:27:41.540213] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:09:48.978 [2024-10-04 08:27:41.540230] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:48.978 #46 NEW cov: 10807 ft: 15179 corp: 6/577b lim: 320 exec/s: 46 rss: 68Mb L: 115/116 MS: 1 ChangeBinInt- 00:09:49.235 [2024-10-04 08:27:41.712575] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:09:49.236 [2024-10-04 08:27:41.712605] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:09:49.236 [2024-10-04 08:27:41.712615] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:09:49.236 [2024-10-04 08:27:41.712631] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:49.236 #47 NEW cov: 10807 ft: 15518 corp: 7/692b lim: 320 exec/s: 47 rss: 68Mb L: 115/116 MS: 1 ChangeByte- 00:09:49.236 [2024-10-04 08:27:41.883303] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [0x800000000, 0x800000000) fd=325 offset=0 prot=0x3: Invalid argument 00:09:49.236 [2024-10-04 08:27:41.883327] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0x800000000, 0x800000000) offset=0 flags=0x3: Invalid argument 00:09:49.236 [2024-10-04 08:27:41.883338] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:09:49.236 [2024-10-04 08:27:41.883355] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:49.494 #48 NEW cov: 10807 ft: 16138 corp: 8/807b lim: 320 exec/s: 48 rss: 68Mb L: 115/116 MS: 1 CrossOver- 00:09:49.494 [2024-10-04 08:27:42.053407] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:09:49.494 [2024-10-04 08:27:42.053430] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:09:49.494 [2024-10-04 08:27:42.053440] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:09:49.494 [2024-10-04 08:27:42.053455] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:49.494 #49 NEW cov: 10807 ft: 16264 corp: 9/940b lim: 320 exec/s: 49 rss: 68Mb L: 133/133 MS: 1 InsertRepeatedBytes- 00:09:49.753 [2024-10-04 08:27:42.224859] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:09:49.753 [2024-10-04 08:27:42.224883] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:09:49.753 
[2024-10-04 08:27:42.224893] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:09:49.753 [2024-10-04 08:27:42.224909] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:49.753 #50 NEW cov: 10814 ft: 16341 corp: 10/1073b lim: 320 exec/s: 50 rss: 68Mb L: 133/133 MS: 1 ShuffleBytes- 00:09:49.753 [2024-10-04 08:27:42.396100] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [0x800000000, 0x800000000) fd=325 offset=0 prot=0x3: Invalid argument 00:09:49.753 [2024-10-04 08:27:42.396124] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0x800000000, 0x800000000) offset=0 flags=0x3: Invalid argument 00:09:49.754 [2024-10-04 08:27:42.396135] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:09:49.754 [2024-10-04 08:27:42.396151] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:50.012 #51 NEW cov: 10814 ft: 16367 corp: 11/1244b lim: 320 exec/s: 51 rss: 68Mb L: 171/171 MS: 1 InsertRepeatedBytes- 00:09:50.012 [2024-10-04 08:27:42.565936] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [0x800000000, 0x800000000) fd=325 offset=0 prot=0x3: Invalid argument 00:09:50.012 [2024-10-04 08:27:42.565963] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0x800000000, 0x800000000) offset=0 flags=0x3: Invalid argument 00:09:50.012 [2024-10-04 08:27:42.565973] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:09:50.012 [2024-10-04 08:27:42.565989] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:50.012 #52 NEW cov: 10814 ft: 16380 corp: 12/1360b lim: 320 exec/s: 26 rss: 68Mb L: 116/171 MS: 1 InsertByte- 00:09:50.012 #52 DONE cov: 10814 ft: 16380 corp: 12/1360b lim: 320 exec/s: 26 rss: 68Mb 00:09:50.012 Done 52 runs in 2 second(s) 00:09:50.271 08:27:42 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-3 00:09:50.271 08:27:42 -- ../common.sh@72 -- # (( i++ )) 00:09:50.271 08:27:42 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:50.271 08:27:42 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:09:50.271 08:27:42 -- vfio/run.sh@22 -- # local fuzzer_type=4 00:09:50.271 08:27:42 -- vfio/run.sh@23 -- # local timen=1 00:09:50.271 08:27:42 -- vfio/run.sh@24 -- # local core=0x1 00:09:50.271 08:27:42 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:09:50.271 08:27:42 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:09:50.271 08:27:42 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:09:50.271 08:27:42 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:09:50.271 08:27:42 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:09:50.271 08:27:42 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:09:50.271 08:27:42 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:09:50.271 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:50.271 08:27:42 -- vfio/run.sh@38 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:09:50.531 [2024-10-04 08:27:42.962414] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:09:50.531 [2024-10-04 08:27:42.962507] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1025760 ] 00:09:50.531 EAL: No free 2048 kB hugepages reported on node 1 00:09:50.531 [2024-10-04 08:27:43.033571] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:50.531 [2024-10-04 08:27:43.068474] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:50.531 [2024-10-04 08:27:43.068622] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:50.790 INFO: Running with entropic power schedule (0xFF, 100). 00:09:50.790 INFO: Seed: 2711094072 00:09:50.790 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:09:50.790 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:09:50.790 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:09:50.790 INFO: A corpus is not provided, starting from an empty corpus 00:09:50.790 #2 INITED exec/s: 0 rss: 60Mb 00:09:50.790 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:50.790 This may also happen if the target rejected all inputs we tried so far 00:09:50.790 [2024-10-04 08:27:43.328231] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=323 offset=0 prot=0x3: Invalid argument 00:09:50.790 [2024-10-04 08:27:43.328268] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:09:50.790 [2024-10-04 08:27:43.328282] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:09:50.790 [2024-10-04 08:27:43.328299] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:50.790 [2024-10-04 08:27:43.329211] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:09:50.790 [2024-10-04 08:27:43.329225] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:09:50.790 [2024-10-04 08:27:43.329241] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:09:51.050 NEW_FUNC[1/638]: 0x452cd8 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:145 00:09:51.050 NEW_FUNC[2/638]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:51.050 #6 NEW cov: 10781 ft: 10339 corp: 2/75b lim: 320 exec/s: 0 rss: 65Mb L: 74/74 MS: 4 ChangeByte-ChangeByte-ChangeByte-InsertRepeatedBytes- 00:09:51.309 [2024-10-04 08:27:43.733205] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:09:51.309 [2024-10-04 08:27:43.733241] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:09:51.309 [2024-10-04 08:27:43.733252] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:09:51.309 [2024-10-04 08:27:43.733271] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:51.309 [2024-10-04 08:27:43.734210] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:09:51.309 [2024-10-04 08:27:43.734235] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:09:51.309 [2024-10-04 08:27:43.734253] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:09:51.309 #12 NEW cov: 10798 ft: 13020 corp: 3/149b lim: 320 exec/s: 0 rss: 67Mb L: 74/74 MS: 1 ShuffleBytes- 00:09:51.309 #13 NEW cov: 10802 ft: 14171 corp: 4/223b lim: 320 exec/s: 0 rss: 67Mb L: 74/74 MS: 1 ChangeBit- 00:09:51.309 [2024-10-04 08:27:43.960825] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0x4 prot=0x3: Invalid argument 00:09:51.309 [2024-10-04 08:27:43.960853] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0x4 flags=0x3: Invalid argument 00:09:51.309 [2024-10-04 08:27:43.960864] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:09:51.309 [2024-10-04 08:27:43.960883] vfio_user.c: 144:vfio_user_read: 
*ERROR*: Command 2 return failure 00:09:51.309 [2024-10-04 08:27:43.961837] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:09:51.309 [2024-10-04 08:27:43.961861] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:09:51.309 [2024-10-04 08:27:43.961878] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:09:51.569 #14 NEW cov: 10802 ft: 14732 corp: 5/297b lim: 320 exec/s: 0 rss: 68Mb L: 74/74 MS: 1 ChangeBit- 00:09:51.569 [2024-10-04 08:27:44.075975] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: DMA region size 18446744073709551615 > max 8796093022208 00:09:51.569 [2024-10-04 08:27:44.076001] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0xffffffffffffffff, 0xfffffffffffffffe) offset=0xffffffffffffffff flags=0x3: No space left on device 00:09:51.569 [2024-10-04 08:27:44.076012] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: No space left on device 00:09:51.569 [2024-10-04 08:27:44.076029] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:51.569 [2024-10-04 08:27:44.076987] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0xffffffffffffffff, 0xfffffffffffffffe) flags=0: No such file or directory 00:09:51.569 [2024-10-04 08:27:44.077012] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:09:51.569 [2024-10-04 08:27:44.077031] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:09:51.569 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:51.569 #19 NEW cov: 10819 ft: 14950 corp: 6/375b lim: 320 exec/s: 0 rss: 68Mb L: 78/78 MS: 5 ShuffleBytes-ShuffleBytes-CrossOver-ShuffleBytes-InsertRepeatedBytes- 00:09:51.828 #25 NEW cov: 10819 ft: 15223 corp: 7/449b lim: 320 exec/s: 25 rss: 68Mb L: 74/78 MS: 1 ChangeBinInt- 00:09:51.828 #26 NEW cov: 10819 ft: 15803 corp: 8/523b lim: 320 exec/s: 26 rss: 68Mb L: 74/78 MS: 1 ChangeBit- 00:09:51.828 [2024-10-04 08:27:44.424405] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0x4 prot=0x3: Invalid argument 00:09:51.828 [2024-10-04 08:27:44.424430] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0x4 flags=0x3: Invalid argument 00:09:51.828 [2024-10-04 08:27:44.424440] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:09:51.828 [2024-10-04 08:27:44.424473] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:51.828 [2024-10-04 08:27:44.425429] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:09:51.828 [2024-10-04 08:27:44.425450] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:09:51.828 [2024-10-04 08:27:44.425468] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:09:51.828 #32 NEW cov: 10819 ft: 16003 corp: 9/597b lim: 320 exec/s: 32 rss: 68Mb L: 74/78 MS: 1 CrossOver- 00:09:52.088 [2024-10-04 08:27:44.539204] vfio_user.c:3096:vfio_user_log: *ERROR*: 
/tmp/vfio-user-4/domain/1: DMA region size 18446744073709551615 > max 8796093022208 00:09:52.088 [2024-10-04 08:27:44.539228] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0xffffffffffffffff, 0xfffffffffffffffe) offset=0xffffffffffffffff flags=0x3: No space left on device 00:09:52.088 [2024-10-04 08:27:44.539239] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: No space left on device 00:09:52.088 [2024-10-04 08:27:44.539272] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:52.088 [2024-10-04 08:27:44.540230] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0xffffffffffffffff, 0xfffffffffffffffe) flags=0: No such file or directory 00:09:52.088 [2024-10-04 08:27:44.540251] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:09:52.088 [2024-10-04 08:27:44.540268] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:09:52.088 #33 NEW cov: 10819 ft: 16518 corp: 10/675b lim: 320 exec/s: 33 rss: 68Mb L: 78/78 MS: 1 ShuffleBytes- 00:09:52.088 [2024-10-04 08:27:44.654006] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0x4 prot=0x3: Invalid argument 00:09:52.088 [2024-10-04 08:27:44.654029] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0x4 flags=0x3: Invalid argument 00:09:52.088 [2024-10-04 08:27:44.654039] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:09:52.088 [2024-10-04 08:27:44.654072] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:52.088 [2024-10-04 08:27:44.655052] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:09:52.088 [2024-10-04 08:27:44.655074] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:09:52.088 [2024-10-04 08:27:44.655091] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:09:52.088 #39 NEW cov: 10819 ft: 16680 corp: 11/793b lim: 320 exec/s: 39 rss: 68Mb L: 118/118 MS: 1 InsertRepeatedBytes- 00:09:52.347 #42 NEW cov: 10819 ft: 16726 corp: 12/909b lim: 320 exec/s: 42 rss: 68Mb L: 116/118 MS: 3 ChangeBit-CopyPart-InsertRepeatedBytes- 00:09:52.347 #43 NEW cov: 10819 ft: 16787 corp: 13/1026b lim: 320 exec/s: 43 rss: 68Mb L: 117/118 MS: 1 InsertByte- 00:09:52.607 #44 NEW cov: 10819 ft: 17009 corp: 14/1100b lim: 320 exec/s: 44 rss: 68Mb L: 74/118 MS: 1 ChangeBit- 00:09:52.607 [2024-10-04 08:27:45.118432] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:09:52.607 [2024-10-04 08:27:45.118458] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:09:52.607 [2024-10-04 08:27:45.118468] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:09:52.607 [2024-10-04 08:27:45.118501] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:52.607 [2024-10-04 08:27:45.119451] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: 
No such file or directory 00:09:52.607 [2024-10-04 08:27:45.119472] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:09:52.607 [2024-10-04 08:27:45.119489] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:09:52.607 #45 NEW cov: 10826 ft: 17090 corp: 15/1182b lim: 320 exec/s: 45 rss: 68Mb L: 82/118 MS: 1 CMP- DE: "\001\000\000\000\005,|I"- 00:09:52.607 [2024-10-04 08:27:45.233216] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:09:52.607 [2024-10-04 08:27:45.233241] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:09:52.607 [2024-10-04 08:27:45.233251] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:09:52.607 [2024-10-04 08:27:45.233284] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:52.607 [2024-10-04 08:27:45.234266] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:09:52.607 [2024-10-04 08:27:45.234288] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:09:52.607 [2024-10-04 08:27:45.234306] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:09:52.866 #47 NEW cov: 10826 ft: 17322 corp: 16/1282b lim: 320 exec/s: 23 rss: 68Mb L: 100/118 MS: 2 ChangeByte-InsertRepeatedBytes- 00:09:52.866 #47 DONE cov: 10826 ft: 17322 corp: 16/1282b lim: 320 exec/s: 23 rss: 68Mb 00:09:52.866 ###### Recommended dictionary. ###### 00:09:52.866 "\001\000\000\000\005,|I" # Uses: 0 00:09:52.866 ###### End of recommended dictionary. 
###### 00:09:52.866 Done 47 runs in 2 second(s) 00:09:53.125 08:27:45 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-4 00:09:53.125 08:27:45 -- ../common.sh@72 -- # (( i++ )) 00:09:53.125 08:27:45 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:53.125 08:27:45 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:09:53.125 08:27:45 -- vfio/run.sh@22 -- # local fuzzer_type=5 00:09:53.125 08:27:45 -- vfio/run.sh@23 -- # local timen=1 00:09:53.125 08:27:45 -- vfio/run.sh@24 -- # local core=0x1 00:09:53.125 08:27:45 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:09:53.125 08:27:45 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:09:53.125 08:27:45 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:09:53.125 08:27:45 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:09:53.125 08:27:45 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:09:53.125 08:27:45 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:09:53.125 08:27:45 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:09:53.125 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:53.125 08:27:45 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:09:53.125 [2024-10-04 08:27:45.597252] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:09:53.125 [2024-10-04 08:27:45.597323] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1026299 ] 00:09:53.125 EAL: No free 2048 kB hugepages reported on node 1 00:09:53.125 [2024-10-04 08:27:45.667212] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:53.125 [2024-10-04 08:27:45.702161] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:53.125 [2024-10-04 08:27:45.702336] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:53.383 INFO: Running with entropic power schedule (0xFF, 100). 00:09:53.383 INFO: Seed: 1044144891 00:09:53.383 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:09:53.383 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:09:53.383 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:09:53.383 INFO: A corpus is not provided, starting from an empty corpus 00:09:53.383 #2 INITED exec/s: 0 rss: 60Mb 00:09:53.383 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:53.383 This may also happen if the target rejected all inputs we tried so far 00:09:53.383 [2024-10-04 08:27:45.985218] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:53.383 [2024-10-04 08:27:45.985277] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:53.899 NEW_FUNC[1/636]: 0x4536d8 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:172 00:09:53.899 NEW_FUNC[2/636]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:53.899 #25 NEW cov: 10744 ft: 10749 corp: 2/71b lim: 120 exec/s: 0 rss: 65Mb L: 70/70 MS: 3 ChangeBit-InsertRepeatedBytes-InsertRepeatedBytes- 00:09:53.899 [2024-10-04 08:27:46.431117] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:53.899 [2024-10-04 08:27:46.431171] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:53.899 NEW_FUNC[1/2]: 0x1371288 in _nvmf_vfio_user_req_free /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:5277 00:09:53.899 NEW_FUNC[2/2]: 0x1c6d5f8 in spdk_io_channel_get_thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:2468 00:09:53.899 #31 NEW cov: 10795 ft: 14359 corp: 3/141b lim: 120 exec/s: 0 rss: 66Mb L: 70/70 MS: 1 ChangeByte- 00:09:54.158 [2024-10-04 08:27:46.603383] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:54.158 [2024-10-04 08:27:46.603413] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:54.158 #37 NEW cov: 10798 ft: 15950 corp: 4/211b lim: 120 exec/s: 0 rss: 67Mb L: 70/70 MS: 1 CrossOver- 00:09:54.158 [2024-10-04 08:27:46.763427] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:54.158 [2024-10-04 08:27:46.763457] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:54.417 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:54.417 #38 NEW cov: 10815 ft: 16128 corp: 5/281b lim: 120 exec/s: 0 rss: 67Mb L: 70/70 MS: 1 CopyPart- 00:09:54.417 [2024-10-04 08:27:46.924147] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:54.418 [2024-10-04 08:27:46.924178] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:54.418 #39 NEW cov: 10815 ft: 16525 corp: 6/355b lim: 120 exec/s: 39 rss: 67Mb L: 74/74 MS: 1 InsertRepeatedBytes- 00:09:54.418 [2024-10-04 08:27:47.084524] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:54.418 [2024-10-04 08:27:47.084555] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:54.676 #40 NEW cov: 10815 ft: 16590 corp: 7/425b lim: 120 exec/s: 40 rss: 67Mb L: 70/74 MS: 1 ChangeByte- 00:09:54.676 [2024-10-04 08:27:47.246484] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:54.676 [2024-10-04 08:27:47.246513] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:54.676 #41 NEW cov: 10815 ft: 16799 corp: 8/495b lim: 120 exec/s: 41 rss: 67Mb L: 70/74 MS: 1 CrossOver- 00:09:54.935 [2024-10-04 08:27:47.408126] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: 
msg0: cmd 8 failed: Invalid argument 00:09:54.935 [2024-10-04 08:27:47.408154] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:54.935 #42 NEW cov: 10815 ft: 17177 corp: 9/565b lim: 120 exec/s: 42 rss: 67Mb L: 70/74 MS: 1 ChangeByte- 00:09:54.935 [2024-10-04 08:27:47.569328] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:54.935 [2024-10-04 08:27:47.569356] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:55.193 #43 NEW cov: 10815 ft: 17259 corp: 10/660b lim: 120 exec/s: 43 rss: 67Mb L: 95/95 MS: 1 CopyPart- 00:09:55.193 [2024-10-04 08:27:47.730271] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:55.193 [2024-10-04 08:27:47.730301] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:55.193 #47 NEW cov: 10822 ft: 17521 corp: 11/697b lim: 120 exec/s: 47 rss: 67Mb L: 37/95 MS: 4 ChangeByte-ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:09:55.452 [2024-10-04 08:27:47.900227] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:55.452 [2024-10-04 08:27:47.900272] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:55.452 #48 NEW cov: 10822 ft: 17525 corp: 12/810b lim: 120 exec/s: 24 rss: 67Mb L: 113/113 MS: 1 InsertRepeatedBytes- 00:09:55.452 #48 DONE cov: 10822 ft: 17525 corp: 12/810b lim: 120 exec/s: 24 rss: 67Mb 00:09:55.452 Done 48 runs in 2 second(s) 00:09:55.711 08:27:48 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-5 00:09:55.711 08:27:48 -- ../common.sh@72 -- # (( i++ )) 00:09:55.711 08:27:48 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:55.711 08:27:48 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:09:55.711 08:27:48 -- vfio/run.sh@22 -- # local fuzzer_type=6 00:09:55.711 08:27:48 -- vfio/run.sh@23 -- # local timen=1 00:09:55.711 08:27:48 -- vfio/run.sh@24 -- # local core=0x1 00:09:55.711 08:27:48 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:09:55.711 08:27:48 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6 00:09:55.711 08:27:48 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1 00:09:55.711 08:27:48 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2 00:09:55.711 08:27:48 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf 00:09:55.711 08:27:48 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:09:55.711 08:27:48 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%; 00:09:55.711 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:55.711 08:27:48 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6 00:09:55.711 [2024-10-04 08:27:48.295527] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
00:09:55.712 [2024-10-04 08:27:48.295623] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1026717 ] 00:09:55.712 EAL: No free 2048 kB hugepages reported on node 1 00:09:55.712 [2024-10-04 08:27:48.367863] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:55.970 [2024-10-04 08:27:48.404107] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:55.970 [2024-10-04 08:27:48.404263] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:55.970 INFO: Running with entropic power schedule (0xFF, 100). 00:09:55.970 INFO: Seed: 3758138886 00:09:55.970 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:09:55.970 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:09:55.971 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:09:55.971 INFO: A corpus is not provided, starting from an empty corpus 00:09:55.971 #2 INITED exec/s: 0 rss: 60Mb 00:09:55.971 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:55.971 This may also happen if the target rejected all inputs we tried so far 00:09:56.229 [2024-10-04 08:27:48.666268] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:56.229 [2024-10-04 08:27:48.666311] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:56.488 NEW_FUNC[1/638]: 0x4543c8 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190 00:09:56.488 NEW_FUNC[2/638]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:56.488 #5 NEW cov: 10773 ft: 10406 corp: 2/58b lim: 90 exec/s: 0 rss: 65Mb L: 57/57 MS: 3 ChangeBinInt-ShuffleBytes-InsertRepeatedBytes- 00:09:56.488 [2024-10-04 08:27:49.070993] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:56.488 [2024-10-04 08:27:49.071038] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:56.488 #6 NEW cov: 10790 ft: 13135 corp: 3/116b lim: 90 exec/s: 0 rss: 67Mb L: 58/58 MS: 1 InsertByte- 00:09:56.747 [2024-10-04 08:27:49.185748] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:56.747 [2024-10-04 08:27:49.185785] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:56.747 #7 NEW cov: 10790 ft: 14454 corp: 4/201b lim: 90 exec/s: 0 rss: 68Mb L: 85/85 MS: 1 CopyPart- 00:09:56.747 [2024-10-04 08:27:49.299527] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:56.747 [2024-10-04 08:27:49.299565] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:56.747 #8 NEW cov: 10790 ft: 14777 corp: 5/270b lim: 90 exec/s: 0 rss: 68Mb L: 69/85 MS: 1 InsertRepeatedBytes- 00:09:56.747 [2024-10-04 08:27:49.423462] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:56.747 [2024-10-04 08:27:49.423498] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:57.005 NEW_FUNC[1/1]: 0x19341e8 in get_rusage 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:57.005 #9 NEW cov: 10807 ft: 14871 corp: 6/355b lim: 90 exec/s: 0 rss: 68Mb L: 85/85 MS: 1 CrossOver- 00:09:57.005 [2024-10-04 08:27:49.538278] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:57.005 [2024-10-04 08:27:49.538315] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:57.005 #10 NEW cov: 10807 ft: 15022 corp: 7/438b lim: 90 exec/s: 10 rss: 68Mb L: 83/85 MS: 1 InsertRepeatedBytes- 00:09:57.005 [2024-10-04 08:27:49.651970] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:57.005 [2024-10-04 08:27:49.652005] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:57.264 #11 NEW cov: 10807 ft: 15477 corp: 8/496b lim: 90 exec/s: 11 rss: 68Mb L: 58/85 MS: 1 InsertByte- 00:09:57.264 [2024-10-04 08:27:49.766801] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:57.264 [2024-10-04 08:27:49.766835] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:57.264 #12 NEW cov: 10807 ft: 15585 corp: 9/570b lim: 90 exec/s: 12 rss: 68Mb L: 74/85 MS: 1 CopyPart- 00:09:57.264 [2024-10-04 08:27:49.881594] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:57.264 [2024-10-04 08:27:49.881629] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:57.522 #13 NEW cov: 10807 ft: 15753 corp: 10/648b lim: 90 exec/s: 13 rss: 68Mb L: 78/85 MS: 1 CopyPart- 00:09:57.522 [2024-10-04 08:27:49.995459] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:57.522 [2024-10-04 08:27:49.995494] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:57.522 #14 NEW cov: 10807 ft: 16001 corp: 11/707b lim: 90 exec/s: 14 rss: 68Mb L: 59/85 MS: 1 InsertByte- 00:09:57.522 [2024-10-04 08:27:50.112275] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:57.522 [2024-10-04 08:27:50.112313] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:57.522 #15 NEW cov: 10807 ft: 16046 corp: 12/794b lim: 90 exec/s: 15 rss: 68Mb L: 87/87 MS: 1 CopyPart- 00:09:57.781 [2024-10-04 08:27:50.228139] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:57.781 [2024-10-04 08:27:50.228175] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:57.781 #18 NEW cov: 10807 ft: 16106 corp: 13/813b lim: 90 exec/s: 18 rss: 68Mb L: 19/87 MS: 3 InsertByte-CopyPart-CrossOver- 00:09:57.781 [2024-10-04 08:27:50.342848] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:57.781 [2024-10-04 08:27:50.342882] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:57.781 #19 NEW cov: 10814 ft: 16384 corp: 14/898b lim: 90 exec/s: 19 rss: 68Mb L: 85/87 MS: 1 ChangeByte- 00:09:57.781 [2024-10-04 08:27:50.457663] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:57.781 [2024-10-04 08:27:50.457697] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:58.039 #20 NEW cov: 10814 ft: 16637 corp: 15/959b lim: 90 exec/s: 20 rss: 68Mb L: 61/87 MS: 1 CMP- DE: "\000\000\000\000"- 00:09:58.039 
[2024-10-04 08:27:50.572505] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:58.039 [2024-10-04 08:27:50.572540] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:58.039 #21 NEW cov: 10814 ft: 16837 corp: 16/1044b lim: 90 exec/s: 10 rss: 68Mb L: 85/87 MS: 1 CopyPart- 00:09:58.039 #21 DONE cov: 10814 ft: 16837 corp: 16/1044b lim: 90 exec/s: 10 rss: 68Mb 00:09:58.039 ###### Recommended dictionary. ###### 00:09:58.039 "\000\000\000\000" # Uses: 0 00:09:58.039 ###### End of recommended dictionary. ###### 00:09:58.039 Done 21 runs in 2 second(s) 00:09:58.298 08:27:50 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-6 00:09:58.298 08:27:50 -- ../common.sh@72 -- # (( i++ )) 00:09:58.298 08:27:50 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:58.298 08:27:50 -- vfio/run.sh@75 -- # trap - SIGINT SIGTERM EXIT 00:09:58.298 00:09:58.298 real 0m19.018s 00:09:58.298 user 0m26.049s 00:09:58.298 sys 0m1.854s 00:09:58.298 08:27:50 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:58.298 08:27:50 -- common/autotest_common.sh@10 -- # set +x 00:09:58.298 ************************************ 00:09:58.298 END TEST vfio_fuzz 00:09:58.298 ************************************ 00:09:58.298 08:27:50 -- fuzz/llvm.sh@67 -- # [[ 1 -eq 0 ]] 00:09:58.298 00:09:58.298 real 1m21.593s 00:09:58.298 user 2m5.324s 00:09:58.298 sys 0m9.032s 00:09:58.299 08:27:50 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:58.299 08:27:50 -- common/autotest_common.sh@10 -- # set +x 00:09:58.299 ************************************ 00:09:58.299 END TEST llvm_fuzz 00:09:58.299 ************************************ 00:09:58.558 08:27:50 -- spdk/autotest.sh@378 -- # [[ 0 -eq 1 ]] 00:09:58.558 08:27:50 -- spdk/autotest.sh@383 -- # trap - SIGINT SIGTERM EXIT 00:09:58.558 08:27:50 -- spdk/autotest.sh@385 -- # timing_enter post_cleanup 00:09:58.558 08:27:50 -- common/autotest_common.sh@712 -- # xtrace_disable 00:09:58.558 08:27:50 -- common/autotest_common.sh@10 -- # set +x 00:09:58.558 08:27:50 -- spdk/autotest.sh@386 -- # autotest_cleanup 00:09:58.558 08:27:50 -- common/autotest_common.sh@1371 -- # local autotest_es=0 00:09:58.558 08:27:50 -- common/autotest_common.sh@1372 -- # xtrace_disable 00:09:58.558 08:27:50 -- common/autotest_common.sh@10 -- # set +x 00:10:05.129 INFO: APP EXITING 00:10:05.129 INFO: killing all VMs 00:10:05.129 INFO: killing vhost app 00:10:05.129 INFO: EXIT DONE 00:10:07.669 Waiting for block devices as requested 00:10:07.669 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:10:07.669 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:10:07.669 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:10:07.669 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:10:07.669 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:10:07.669 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:10:07.669 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:10:07.929 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:10:07.929 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:10:07.929 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:10:07.929 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:10:08.189 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:10:08.189 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:10:08.189 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:10:08.448 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:10:08.448 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:10:08.708 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:10:12.907 Cleaning 
00:10:12.907 Removing: /dev/shm/spdk_tgt_trace.pid990359
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1000412
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1000694
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1000855
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1001052
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1001358
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1001379
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1001482
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1001711
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1001994
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1002261
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1002544
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1002729
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1002904
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1003123
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1003404
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1003678
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1003973
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1004187
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1004348
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1004553
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1004837
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1005103
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1005392
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1005605
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1005773
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1005964
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1006253
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1006524
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1006805
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1007029
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1007197
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1007381
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1007662
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1007928
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1008225
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1008452
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1008616
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1008794
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1009083
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1009352
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1009639
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1009915
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1010136
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1010282
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1010508
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1010782
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1011064
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1011138
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1011460
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1012004
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1012471
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1013008
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1013308
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1013840
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1014272
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1014674
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1015211
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1015563
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1016044
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1016592
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1016916
00:10:12.907 Removing: /var/run/dpdk/spdk_pid1017423
00:10:12.908 Removing: /var/run/dpdk/spdk_pid1018022
00:10:12.908 Removing: /var/run/dpdk/spdk_pid1018646
00:10:12.908 Removing: /var/run/dpdk/spdk_pid1019353
00:10:12.908 Removing: /var/run/dpdk/spdk_pid1019710
00:10:12.908 Removing: /var/run/dpdk/spdk_pid1020184
00:10:12.908 Removing: /var/run/dpdk/spdk_pid1020703
00:10:12.908 Removing: /var/run/dpdk/spdk_pid1021011
00:10:12.908 Removing: /var/run/dpdk/spdk_pid1021554
00:10:12.908 Removing: /var/run/dpdk/spdk_pid1021958
00:10:12.908 Removing: /var/run/dpdk/spdk_pid1022396
00:10:12.908 Removing: /var/run/dpdk/spdk_pid1022930
00:10:12.908 Removing: /var/run/dpdk/spdk_pid1023220
00:10:12.908 Removing: /var/run/dpdk/spdk_pid1023827
00:10:12.908 Removing: /var/run/dpdk/spdk_pid1024376
00:10:12.908 Removing: /var/run/dpdk/spdk_pid1024770
00:10:12.908 Removing: /var/run/dpdk/spdk_pid1025213
00:10:12.908 Removing: /var/run/dpdk/spdk_pid1025760
00:10:12.908 Removing: /var/run/dpdk/spdk_pid1026299
00:10:12.908 Removing: /var/run/dpdk/spdk_pid1026717
00:10:12.908 Removing: /var/run/dpdk/spdk_pid987883
00:10:12.908 Removing: /var/run/dpdk/spdk_pid989159
00:10:12.908 Removing: /var/run/dpdk/spdk_pid990359
00:10:12.908 Removing: /var/run/dpdk/spdk_pid991014
00:10:12.908 Removing: /var/run/dpdk/spdk_pid991338
00:10:12.908 Removing: /var/run/dpdk/spdk_pid991659
00:10:12.908 Removing: /var/run/dpdk/spdk_pid992015
00:10:12.908 Removing: /var/run/dpdk/spdk_pid992298
00:10:12.908 Removing: /var/run/dpdk/spdk_pid992472
00:10:12.908 Removing: /var/run/dpdk/spdk_pid992757
00:10:12.908 Removing: /var/run/dpdk/spdk_pid993072
00:10:12.908 Removing: /var/run/dpdk/spdk_pid993929
00:10:12.908 Removing: /var/run/dpdk/spdk_pid997005
00:10:12.908 Removing: /var/run/dpdk/spdk_pid997338
00:10:12.908 Removing: /var/run/dpdk/spdk_pid997641
00:10:12.908 Removing: /var/run/dpdk/spdk_pid997741
00:10:12.908 Removing: /var/run/dpdk/spdk_pid998318
00:10:12.908 Removing: /var/run/dpdk/spdk_pid998540
00:10:12.908 Removing: /var/run/dpdk/spdk_pid998898
00:10:12.908 Removing: /var/run/dpdk/spdk_pid999167
00:10:12.908 Removing: /var/run/dpdk/spdk_pid999436
00:10:12.908 Removing: /var/run/dpdk/spdk_pid999482
00:10:12.908 Removing: /var/run/dpdk/spdk_pid999772
00:10:12.908 Removing: /var/run/dpdk/spdk_pid999882
00:10:12.908 Clean
00:10:16.355 killing process with pid 943423
00:10:16.355 killing process with pid 943420
00:10:16.355 killing process with pid 943422
00:10:16.355 killing process with pid 943421
00:10:16.355 08:28:08 -- common/autotest_common.sh@1436 -- # return 0
00:10:16.355 08:28:08 -- spdk/autotest.sh@387 -- # timing_exit post_cleanup
00:10:16.355 08:28:08 -- common/autotest_common.sh@718 -- # xtrace_disable
00:10:16.355 08:28:08 -- common/autotest_common.sh@10 -- # set +x
00:10:16.355 08:28:08 -- spdk/autotest.sh@389 -- # timing_exit autotest
00:10:16.355 08:28:08 -- common/autotest_common.sh@718 -- # xtrace_disable
00:10:16.355 08:28:08 -- common/autotest_common.sh@10 -- # set +x
00:10:16.355 08:28:08 -- spdk/autotest.sh@390 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:10:16.355 08:28:08 -- spdk/autotest.sh@392 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]]
00:10:16.355 08:28:08 -- spdk/autotest.sh@392 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log
00:10:16.355 08:28:08 -- spdk/autotest.sh@394 -- # hash lcov
00:10:16.355 08:28:08 -- spdk/autotest.sh@394 -- # [[ CC_TYPE=clang == *\c\l\a\n\g* ]]
00:10:16.615 08:28:09 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
00:10:16.615 08:28:09 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]]
00:10:16.615 08:28:09 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:10:16.615 08:28:09 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:10:16.615 08:28:09 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:10:16.615 08:28:09 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:10:16.615 08:28:09 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:10:16.615 08:28:09 -- paths/export.sh@5 -- $ export PATH
00:10:16.615 08:28:09 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:10:16.615 08:28:09 -- common/autobuild_common.sh@439 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output
00:10:16.615 08:28:09 -- common/autobuild_common.sh@440 -- $ date +%s
00:10:16.615 08:28:09 -- common/autobuild_common.sh@440 -- $ mktemp -dt spdk_1728023289.XXXXXX
00:10:16.615 08:28:09 -- common/autobuild_common.sh@440 -- $ SPDK_WORKSPACE=/tmp/spdk_1728023289.gEJQMB
00:10:16.615 08:28:09 -- common/autobuild_common.sh@442 -- $ [[ -n '' ]]
00:10:16.615 08:28:09 -- common/autobuild_common.sh@446 -- $ '[' -n v22.11.4 ']'
00:10:16.615 08:28:09 -- common/autobuild_common.sh@447 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:10:16.615 08:28:09 -- common/autobuild_common.sh@447 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk'
00:10:16.615 08:28:09 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp'
00:10:16.615 08:28:09 -- common/autobuild_common.sh@455 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:10:16.615 08:28:09 -- common/autobuild_common.sh@456 -- $ get_config_params
00:10:16.615 08:28:09 -- common/autotest_common.sh@387 -- $ xtrace_disable
00:10:16.615 08:28:09 -- common/autotest_common.sh@10 -- $ set +x
00:10:16.615 08:28:09 -- common/autobuild_common.sh@456 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user'
00:10:16.615 08:28:09 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j112
00:10:16.615 08:28:09 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:10:16.615 08:28:09 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]]
00:10:16.615 08:28:09 -- spdk/autopackage.sh@18 -- $ [[ 1 -eq 0 ]]
00:10:16.615 08:28:09 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]]
00:10:16.615 08:28:09 -- spdk/autopackage.sh@19 -- $ timing_finish
00:10:16.615 08:28:09 -- common/autotest_common.sh@724 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl
00:10:16.615 08:28:09 -- common/autotest_common.sh@725 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']'
00:10:16.615 08:28:09 -- common/autotest_common.sh@727 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:10:16.615 08:28:09 -- spdk/autopackage.sh@20 -- $ exit 0
00:10:16.615 + [[ -n 887538 ]]
00:10:16.615 + sudo kill 887538
00:10:16.624 [Pipeline] }
00:10:16.641 [Pipeline] // stage
00:10:16.646 [Pipeline] }
00:10:16.661 [Pipeline] // timeout
00:10:16.668 [Pipeline] }
00:10:16.685 [Pipeline] // catchError
00:10:16.691 [Pipeline] }
00:10:16.709 [Pipeline] // wrap
00:10:16.716 [Pipeline] }
00:10:16.732 [Pipeline] // catchError
00:10:16.743 [Pipeline] stage
00:10:16.746 [Pipeline] { (Epilogue)
00:10:16.762 [Pipeline] catchError
00:10:16.764 [Pipeline] {
00:10:16.780 [Pipeline] echo
00:10:16.782 Cleanup processes
00:10:16.791 [Pipeline] sh
00:10:17.082 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:10:17.082 943468 tee /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-load.pm.log
00:10:17.082 1035656 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:10:17.097 [Pipeline] sh
00:10:17.382 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:10:17.383 ++ grep -v 'sudo pgrep'
00:10:17.383 ++ awk '{print $1}'
00:10:17.383 + sudo kill -9
00:10:17.383 + true
00:10:17.396 [Pipeline] sh
00:10:17.681 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh
00:10:17.681 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB
00:10:17.681 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB
00:10:18.631 [Pipeline] sh
00:10:18.916 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh
00:10:18.916 Artifacts sizes are good
00:10:18.933 [Pipeline] archiveArtifacts
00:10:18.942 Archiving artifacts
00:10:18.999 [Pipeline] sh
00:10:19.289 + sudo chown -R sys_sgci: /var/jenkins/workspace/short-fuzz-phy-autotest
00:10:19.307 [Pipeline] cleanWs
00:10:19.319 [WS-CLEANUP] Deleting project workspace...
00:10:19.319 [WS-CLEANUP] Deferred wipeout is used...
00:10:19.325 [WS-CLEANUP] done
00:10:19.327 [Pipeline] }
00:10:19.346 [Pipeline] // catchError
00:10:19.362 [Pipeline] sh
00:10:19.645 + logger -p user.info -t JENKINS-CI
00:10:19.653 [Pipeline] }
00:10:19.670 [Pipeline] // stage
00:10:19.676 [Pipeline] }
00:10:19.693 [Pipeline] // node
00:10:19.701 [Pipeline] End of Pipeline
00:10:19.752 Finished: SUCCESS